Mar 08 03:09:20.841054 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 08 03:09:21.531570 master-0 kubenswrapper[4048]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:09:21.531570 master-0 kubenswrapper[4048]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 08 03:09:21.531570 master-0 kubenswrapper[4048]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:09:21.531570 master-0 kubenswrapper[4048]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:09:21.531570 master-0 kubenswrapper[4048]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 08 03:09:21.533041 master-0 kubenswrapper[4048]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
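The deprecation warnings above all point at the same migration: each of these command-line flags is supposed to move into the KubeletConfiguration file passed via --config. A minimal sketch of the equivalent config fields, assuming kubelet.config.k8s.io/v1beta1 and taking the values from the flag dump later in this log; verify the exact field spellings against your kubelet version:

```yaml
# Hypothetical KubeletConfiguration fragment replacing the deprecated flags
# warned about above; values are the ones the kubelet logs below.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --register-with-taints
registerWithTaints:
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
# replaces --system-reserved
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
```

--minimum-container-ttl-duration has no config-file equivalent; per its warning, eviction thresholds (evictionHard / evictionSoft) are the replacement.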
Mar 08 03:09:21.533041 master-0 kubenswrapper[4048]: I0308 03:09:21.532590 4048 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 08 03:09:21.540403 master-0 kubenswrapper[4048]: W0308 03:09:21.540343 4048 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:09:21.540403 master-0 kubenswrapper[4048]: W0308 03:09:21.540383 4048 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:09:21.540403 master-0 kubenswrapper[4048]: W0308 03:09:21.540393 4048 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:09:21.540403 master-0 kubenswrapper[4048]: W0308 03:09:21.540401 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:09:21.540403 master-0 kubenswrapper[4048]: W0308 03:09:21.540410 4048 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540421 4048 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540429 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540438 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540446 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540456 4048 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540464 4048 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540473 4048 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540506 4048 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540514 4048 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540521 4048 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540529 4048 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540538 4048 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540546 4048 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540553 4048 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540561 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540568 4048 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540577 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540585 4048 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540592 4048 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:09:21.540704 master-0 kubenswrapper[4048]: W0308 03:09:21.540600 4048 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540608 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540616 4048 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540641 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540649 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540657 4048 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540664 4048 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540672 4048 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540680 4048 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540689 4048 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540700 4048 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540710 4048 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540720 4048 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540731 4048 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540740 4048 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540748 4048 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540757 4048 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540764 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540772 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540810 4048 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:09:21.541699 master-0 kubenswrapper[4048]: W0308 03:09:21.540820 4048 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540833 4048 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540842 4048 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540856 4048 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540870 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540884 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540895 4048 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540908 4048 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540920 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540932 4048 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540942 4048 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540951 4048 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540961 4048 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540970 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540984 4048 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.540993 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.541003 4048 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.541012 4048 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.541022 4048 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.541032 4048 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:09:21.542597 master-0 kubenswrapper[4048]: W0308 03:09:21.541040 4048 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: W0308 03:09:21.541049 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: W0308 03:09:21.541057 4048 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: W0308 03:09:21.541065 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: W0308 03:09:21.541072 4048 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: W0308 03:09:21.541080 4048 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: W0308 03:09:21.541088 4048 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: W0308 03:09:21.541098 4048 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.541919 4048 flags.go:64] FLAG: --address="0.0.0.0"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.541948 4048 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.541963 4048 flags.go:64] FLAG: --anonymous-auth="true"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.541974 4048 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.541988 4048 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.541998 4048 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542010 4048 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542021 4048 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542031 4048 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542040 4048 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542049 4048 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542058 4048 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542069 4048 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542078 4048 flags.go:64] FLAG: --cgroup-root=""
Mar 08 03:09:21.543438 master-0 kubenswrapper[4048]: I0308 03:09:21.542089 4048 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542098 4048 flags.go:64] FLAG: --client-ca-file=""
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542107 4048 flags.go:64] FLAG: --cloud-config=""
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542117 4048 flags.go:64] FLAG: --cloud-provider=""
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542126 4048 flags.go:64] FLAG: --cluster-dns="[]"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542139 4048 flags.go:64] FLAG: --cluster-domain=""
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542147 4048 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542157 4048 flags.go:64] FLAG: --config-dir=""
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542166 4048 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542176 4048 flags.go:64] FLAG: --container-log-max-files="5"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542187 4048 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542198 4048 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542207 4048 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542217 4048 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542226 4048 flags.go:64] FLAG: --contention-profiling="false"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542235 4048 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542244 4048 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542254 4048 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542263 4048 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542274 4048 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542284 4048 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542293 4048 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542301 4048 flags.go:64] FLAG: --enable-load-reader="false"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542311 4048 flags.go:64] FLAG: --enable-server="true"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542320 4048 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 08 03:09:21.544399 master-0 kubenswrapper[4048]: I0308 03:09:21.542332 4048 flags.go:64] FLAG: --event-burst="100"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542341 4048 flags.go:64] FLAG: --event-qps="50"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542350 4048 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542359 4048 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542368 4048 flags.go:64] FLAG: --eviction-hard=""
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542379 4048 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542388 4048 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542397 4048 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542406 4048 flags.go:64] FLAG: --eviction-soft=""
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542416 4048 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542425 4048 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542434 4048 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542443 4048 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542452 4048 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542461 4048 flags.go:64] FLAG: --fail-swap-on="true"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542470 4048 flags.go:64] FLAG: --feature-gates=""
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542508 4048 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542518 4048 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542528 4048 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542538 4048 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542548 4048 flags.go:64] FLAG: --healthz-port="10248"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542557 4048 flags.go:64] FLAG: --help="false"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542566 4048 flags.go:64] FLAG: --hostname-override=""
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542575 4048 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542584 4048 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 08 03:09:21.545607 master-0 kubenswrapper[4048]: I0308 03:09:21.542593 4048 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542603 4048 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542613 4048 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542622 4048 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542633 4048 flags.go:64] FLAG: --image-service-endpoint=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542643 4048 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542652 4048 flags.go:64] FLAG: --kube-api-burst="100"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542661 4048 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542671 4048 flags.go:64] FLAG: --kube-api-qps="50"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542679 4048 flags.go:64] FLAG: --kube-reserved=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542689 4048 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542699 4048 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542708 4048 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542716 4048 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542725 4048 flags.go:64] FLAG: --lock-file=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542734 4048 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542745 4048 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542754 4048 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542768 4048 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542777 4048 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542787 4048 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542796 4048 flags.go:64] FLAG: --logging-format="text"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542805 4048 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542815 4048 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542824 4048 flags.go:64] FLAG: --manifest-url=""
Mar 08 03:09:21.546720 master-0 kubenswrapper[4048]: I0308 03:09:21.542833 4048 flags.go:64] FLAG: --manifest-url-header=""
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542846 4048 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542855 4048 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542866 4048 flags.go:64] FLAG: --max-pods="110"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542875 4048 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542884 4048 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542894 4048 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542903 4048 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542912 4048 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542921 4048 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542930 4048 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542951 4048 flags.go:64] FLAG: --node-status-max-images="50"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542960 4048 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542970 4048 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542979 4048 flags.go:64] FLAG: --pod-cidr=""
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.542988 4048 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543002 4048 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543011 4048 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543021 4048 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543030 4048 flags.go:64] FLAG: --port="10250"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543039 4048 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543049 4048 flags.go:64] FLAG: --provider-id=""
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543058 4048 flags.go:64] FLAG: --qos-reserved=""
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543067 4048 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 03:09:21.547814 master-0 kubenswrapper[4048]: I0308 03:09:21.543077 4048 flags.go:64] FLAG: --register-node="true"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543085 4048 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543095 4048 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543109 4048 flags.go:64] FLAG: --registry-burst="10"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543118 4048 flags.go:64] FLAG: --registry-qps="5"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543127 4048 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543136 4048 flags.go:64] FLAG: --reserved-memory=""
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543148 4048 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543159 4048 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543168 4048 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543177 4048 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543186 4048 flags.go:64] FLAG: --runonce="false"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543195 4048 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543204 4048 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543213 4048 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543222 4048 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543231 4048 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543241 4048 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543250 4048 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543259 4048 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543268 4048 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543277 4048 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543286 4048 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543295 4048 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543304 4048 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 03:09:21.549124 master-0 kubenswrapper[4048]: I0308 03:09:21.543313 4048 flags.go:64] FLAG: --system-cgroups=""
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543322 4048 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543335 4048 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543344 4048 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543353 4048 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543364 4048 flags.go:64] FLAG: --tls-min-version=""
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543373 4048 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543382 4048 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543391 4048 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543406 4048 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543416 4048 flags.go:64] FLAG: --v="2"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543427 4048 flags.go:64] FLAG: --version="false"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543438 4048 flags.go:64] FLAG: --vmodule=""
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543450 4048 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: I0308 03:09:21.543461 4048 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543684 4048 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543696 4048 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543705 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543713 4048 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543722 4048 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543730 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543738 4048 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543747 4048 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:09:21.550356 master-0 kubenswrapper[4048]: W0308 03:09:21.543754 4048 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543765 4048 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543775 4048 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543784 4048 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543793 4048 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543801 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543809 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543817 4048 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543825 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543833 4048 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543841 4048 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543848 4048 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543856 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543864 4048 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543872 4048 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543879 4048 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543887 4048 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543895 4048 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543906 4048 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543914 4048 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:09:21.551383 master-0 kubenswrapper[4048]: W0308 03:09:21.543922 4048 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543932 4048 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543943 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543952 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543960 4048 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543967 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543975 4048 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543983 4048 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543991 4048 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.543999 4048 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544007 4048 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544015 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544022 4048 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544033 4048 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544043 4048 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544052 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544062 4048 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544071 4048 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:09:21.552539 master-0 kubenswrapper[4048]: W0308 03:09:21.544079 4048 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544089 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544098 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544106 4048 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544113 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544122 4048 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544130 4048 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544138 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544145 4048 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544153 4048 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544160 4048 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544168 4048 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544185 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544193 4048 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544201 4048 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544208 4048 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544216 4048 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544224 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544232 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544239 4048 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:09:21.553385 master-0 kubenswrapper[4048]: W0308 03:09:21.544248 4048 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:09:21.554311 master-0 kubenswrapper[4048]: W0308 03:09:21.544257 4048 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:09:21.554311 master-0 kubenswrapper[4048]: W0308 03:09:21.544264 4048 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:09:21.554311 master-0 kubenswrapper[4048]: W0308 03:09:21.544272 4048 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:09:21.554311 master-0 kubenswrapper[4048]: W0308 03:09:21.544280 4048 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:09:21.554311 master-0 kubenswrapper[4048]: W0308 03:09:21.544287 4048 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:09:21.554311 master-0 kubenswrapper[4048]: I0308 03:09:21.544311 4048 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:09:21.555026 master-0 kubenswrapper[4048]: I0308 03:09:21.554934 4048 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 08 03:09:21.555026 master-0 kubenswrapper[4048]: I0308 03:09:21.555012 4048 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 03:09:21.555156 master-0 kubenswrapper[4048]: W0308 03:09:21.555145 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:09:21.555211 master-0 kubenswrapper[4048]: W0308 03:09:21.555161 4048 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:09:21.555211 master-0 kubenswrapper[4048]: W0308 03:09:21.555171 4048 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:09:21.555211 master-0 kubenswrapper[4048]: W0308 03:09:21.555181 4048 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:09:21.555211 master-0 kubenswrapper[4048]: W0308 03:09:21.555191 4048 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:09:21.555211 master-0 kubenswrapper[4048]: W0308 03:09:21.555199 4048 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:09:21.555211 master-0 kubenswrapper[4048]: W0308 03:09:21.555207 4048 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:09:21.555211 master-0 kubenswrapper[4048]: W0308 03:09:21.555216 4048 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555225 4048 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555235 4048 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555243 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555250 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555259 4048 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555266 4048 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555275 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555283 4048 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555291 4048 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555299 4048 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555306 4048 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555315 4048 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555323 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555331 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555338 4048 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555346 4048 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555354 4048 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555365 4048 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:09:21.555580 master-0 kubenswrapper[4048]: W0308 03:09:21.555379 4048 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555388 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555397 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555406 4048 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555415 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555423 4048 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555431 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555438 4048 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555448 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555456 4048 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555464 4048 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555471 4048 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555480 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555513 4048 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555522 4048 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555530 4048 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555538 4048 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555546 4048 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555553 4048 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555561 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:09:21.556415 master-0 kubenswrapper[4048]: W0308 03:09:21.555569 4048 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555577 4048 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555585 4048 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555593 4048 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555601 4048 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555610 4048 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555618 4048 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555626 4048 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555637 4048 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555646 4048 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555656 4048 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555667 4048 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555677 4048 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555687 4048 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555695 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555706 4048 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555715 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555723 4048 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555731 4048 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555739 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:09:21.557948 master-0 kubenswrapper[4048]: W0308 03:09:21.555748 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.555755 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.555766 4048 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.555775 4048 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.555786 4048 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.555797 4048 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: I0308 03:09:21.555811 4048 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556038 4048 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556053 4048 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556062 4048 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556072 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556080 4048 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556087 4048 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556095 4048 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556103 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:09:21.559121 master-0 kubenswrapper[4048]: W0308 03:09:21.556111 4048 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556119 4048 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556126 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556134 4048 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556142 4048 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556150 4048 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556158 4048 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556166 4048 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556173 4048 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556181 4048 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556188 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556197 4048 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556205 4048 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556212 4048 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556220 4048 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556228 4048 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556236 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556244 4048 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556251 4048 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:09:21.559847 master-0 kubenswrapper[4048]: W0308 03:09:21.556262 4048 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556273 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556283 4048 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556294 4048 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556302 4048 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556312 4048 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556326 4048 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556335 4048 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556343 4048 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556352 4048 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556359 4048 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556367 4048 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556375 4048 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556383 4048 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556391 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556398 4048 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556406 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556416 4048 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556426 4048 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:09:21.560720 master-0 kubenswrapper[4048]: W0308 03:09:21.556434 4048 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556443 4048 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556451 4048 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556459 4048 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556467 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556475 4048 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556503 4048 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556512 4048 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556519 4048 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556527 4048 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556535 4048 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556543 4048 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556550 4048 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556558 4048 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556566 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556573 4048 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556582 4048 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556590 4048 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556601 4048 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556611 4048 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:09:21.561637 master-0 kubenswrapper[4048]: W0308 03:09:21.556621 4048 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:09:21.562664 master-0 kubenswrapper[4048]: W0308 03:09:21.556631 4048 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:09:21.562664 master-0 kubenswrapper[4048]: W0308 03:09:21.556640 4048 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:09:21.562664 master-0 kubenswrapper[4048]: W0308 03:09:21.556651 4048 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:09:21.562664 master-0 kubenswrapper[4048]: W0308 03:09:21.556660 4048 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:09:21.562664 master-0 kubenswrapper[4048]: W0308 03:09:21.556673 4048 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:09:21.562664 master-0 kubenswrapper[4048]: I0308 03:09:21.556690 4048 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:09:21.562664 master-0 kubenswrapper[4048]: I0308 03:09:21.558525 4048 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 08 03:09:21.567065 master-0 kubenswrapper[4048]: I0308 03:09:21.567006 4048 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 08 03:09:21.569058 master-0 kubenswrapper[4048]: I0308 03:09:21.569012 4048 server.go:997] "Starting client certificate rotation"
Mar 08 03:09:21.569058 master-0 kubenswrapper[4048]: I0308 03:09:21.569057 4048 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 08 03:09:21.571602 master-0 kubenswrapper[4048]: I0308 03:09:21.571537 4048 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 08 03:09:21.597053 master-0 kubenswrapper[4048]: I0308 03:09:21.596935 4048 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:09:21.599888 master-0 kubenswrapper[4048]: I0308 03:09:21.599830 4048 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:09:21.602069 master-0 kubenswrapper[4048]: E0308 03:09:21.601977 4048 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:09:21.623108 master-0 kubenswrapper[4048]: I0308 03:09:21.622700 4048 log.go:25] "Validated CRI v1 runtime API"
Mar 08 03:09:21.629564 master-0 kubenswrapper[4048]: I0308 03:09:21.629516 4048 log.go:25] "Validated CRI v1 image API"
Mar 08 03:09:21.632665 master-0 kubenswrapper[4048]: I0308 03:09:21.632564 4048 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 08 03:09:21.638239 master-0 kubenswrapper[4048]: I0308 03:09:21.638172 4048 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 87790f63-c01f-464b-b8aa-2380aaf22629:/dev/vda3 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 08 03:09:21.638239 master-0 kubenswrapper[4048]: I0308 03:09:21.638223 4048 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252
minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Mar 08 03:09:21.666788 master-0 kubenswrapper[4048]: I0308 03:09:21.666278 4048 manager.go:217] Machine: {Timestamp:2026-03-08 03:09:21.662814079 +0000 UTC m=+0.628286720 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:b3a4f7075cb34fef92c3bca0876fb6a9 SystemUUID:b3a4f707-5cb3-4fef-92c3-bca0876fb6a9 BootID:ab1d3f01-9ab7-4687-a25d-e07ad2358a90 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:3a:b1:eb Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:76:56:79 Speed:-1 Mtu:9000} {Name:ovs-system 
MacAddress:06:cd:ac:c4:de:2f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified 
Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 03:09:21.666788 master-0 kubenswrapper[4048]: I0308 03:09:21.666726 4048 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 08 03:09:21.667028 master-0 kubenswrapper[4048]: I0308 03:09:21.666932 4048 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 03:09:21.668906 master-0 kubenswrapper[4048]: I0308 03:09:21.668855 4048 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 03:09:21.669259 master-0 kubenswrapper[4048]: I0308 03:09:21.669192 4048 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 03:09:21.669621 master-0 kubenswrapper[4048]: I0308 03:09:21.669251 4048 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentag
e":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 03:09:21.669714 master-0 kubenswrapper[4048]: I0308 03:09:21.669649 4048 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 03:09:21.669714 master-0 kubenswrapper[4048]: I0308 03:09:21.669671 4048 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 03:09:21.669837 master-0 kubenswrapper[4048]: I0308 03:09:21.669809 4048 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:09:21.669893 master-0 kubenswrapper[4048]: I0308 03:09:21.669858 4048 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:09:21.670101 master-0 kubenswrapper[4048]: I0308 03:09:21.670061 4048 state_mem.go:36] "Initialized new in-memory state store" Mar 08 03:09:21.670276 master-0 kubenswrapper[4048]: I0308 03:09:21.670239 4048 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 03:09:21.674645 master-0 kubenswrapper[4048]: I0308 03:09:21.674593 4048 kubelet.go:418] "Attempting to sync node with API server" Mar 08 03:09:21.674645 master-0 kubenswrapper[4048]: I0308 03:09:21.674636 4048 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 03:09:21.674809 master-0 kubenswrapper[4048]: I0308 03:09:21.674681 4048 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 03:09:21.674809 master-0 kubenswrapper[4048]: I0308 03:09:21.674704 4048 kubelet.go:324] "Adding apiserver pod source" Mar 08 03:09:21.674809 master-0 
kubenswrapper[4048]: I0308 03:09:21.674734 4048 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 03:09:21.680121 master-0 kubenswrapper[4048]: I0308 03:09:21.680039 4048 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 08 03:09:21.681959 master-0 kubenswrapper[4048]: W0308 03:09:21.681837 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:21.682057 master-0 kubenswrapper[4048]: W0308 03:09:21.681910 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:21.682057 master-0 kubenswrapper[4048]: E0308 03:09:21.681995 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:21.682057 master-0 kubenswrapper[4048]: E0308 03:09:21.682028 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:21.682594 master-0 kubenswrapper[4048]: I0308 03:09:21.682550 4048 
kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 08 03:09:21.682913 master-0 kubenswrapper[4048]: I0308 03:09:21.682876 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 08 03:09:21.682972 master-0 kubenswrapper[4048]: I0308 03:09:21.682916 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 08 03:09:21.682972 master-0 kubenswrapper[4048]: I0308 03:09:21.682931 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 08 03:09:21.682972 master-0 kubenswrapper[4048]: I0308 03:09:21.682944 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 08 03:09:21.682972 master-0 kubenswrapper[4048]: I0308 03:09:21.682960 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 08 03:09:21.683281 master-0 kubenswrapper[4048]: I0308 03:09:21.682996 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 08 03:09:21.683281 master-0 kubenswrapper[4048]: I0308 03:09:21.683010 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 08 03:09:21.683281 master-0 kubenswrapper[4048]: I0308 03:09:21.683022 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 08 03:09:21.683281 master-0 kubenswrapper[4048]: I0308 03:09:21.683037 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 08 03:09:21.683281 master-0 kubenswrapper[4048]: I0308 03:09:21.683066 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 08 03:09:21.683281 master-0 kubenswrapper[4048]: I0308 03:09:21.683111 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 08 03:09:21.683281 master-0 kubenswrapper[4048]: I0308 03:09:21.683133 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 08 03:09:21.684227 master-0 
kubenswrapper[4048]: I0308 03:09:21.684185 4048 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 08 03:09:21.686174 master-0 kubenswrapper[4048]: I0308 03:09:21.686125 4048 server.go:1280] "Started kubelet" Mar 08 03:09:21.687927 master-0 kubenswrapper[4048]: I0308 03:09:21.687602 4048 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 03:09:21.688137 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 08 03:09:21.689576 master-0 kubenswrapper[4048]: I0308 03:09:21.688140 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:21.689576 master-0 kubenswrapper[4048]: I0308 03:09:21.688185 4048 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 03:09:21.689576 master-0 kubenswrapper[4048]: I0308 03:09:21.688319 4048 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 08 03:09:21.689576 master-0 kubenswrapper[4048]: I0308 03:09:21.689117 4048 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 08 03:09:21.694451 master-0 kubenswrapper[4048]: I0308 03:09:21.694183 4048 server.go:449] "Adding debug handlers to kubelet server" Mar 08 03:09:21.700995 master-0 kubenswrapper[4048]: E0308 03:09:21.698976 4048 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abef4bd9b2a85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.686071941 +0000 UTC m=+0.651544542,LastTimestamp:2026-03-08 03:09:21.686071941 +0000 UTC m=+0.651544542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:21.701232 master-0 kubenswrapper[4048]: I0308 03:09:21.701178 4048 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 08 03:09:21.701272 master-0 kubenswrapper[4048]: I0308 03:09:21.701245 4048 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 08 03:09:21.701445 master-0 kubenswrapper[4048]: I0308 03:09:21.701402 4048 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 08 03:09:21.701502 master-0 kubenswrapper[4048]: I0308 03:09:21.701446 4048 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 08 03:09:21.701532 master-0 kubenswrapper[4048]: E0308 03:09:21.701480 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:21.701609 master-0 kubenswrapper[4048]: I0308 03:09:21.701591 4048 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 08 03:09:21.701761 master-0 kubenswrapper[4048]: I0308 03:09:21.701712 4048 reconstruct.go:97] "Volume reconstruction finished" Mar 08 03:09:21.701761 master-0 kubenswrapper[4048]: I0308 03:09:21.701759 4048 reconciler.go:26] "Reconciler: start to sync state" Mar 08 03:09:21.702392 master-0 kubenswrapper[4048]: W0308 03:09:21.702338 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:21.702527 master-0 kubenswrapper[4048]: E0308 03:09:21.702508 4048 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:21.703080 master-0 kubenswrapper[4048]: E0308 03:09:21.702959 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 08 03:09:21.703129 master-0 kubenswrapper[4048]: I0308 03:09:21.703079 4048 factory.go:55] Registering systemd factory Mar 08 03:09:21.703129 master-0 kubenswrapper[4048]: I0308 03:09:21.703115 4048 factory.go:221] Registration of the systemd container factory successfully Mar 08 03:09:21.704258 master-0 kubenswrapper[4048]: I0308 03:09:21.704210 4048 factory.go:153] Registering CRI-O factory Mar 08 03:09:21.704258 master-0 kubenswrapper[4048]: I0308 03:09:21.704253 4048 factory.go:221] Registration of the crio container factory successfully Mar 08 03:09:21.704826 master-0 kubenswrapper[4048]: I0308 03:09:21.704786 4048 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 08 03:09:21.704870 master-0 kubenswrapper[4048]: I0308 03:09:21.704833 4048 factory.go:103] Registering Raw factory Mar 08 03:09:21.705106 master-0 kubenswrapper[4048]: I0308 03:09:21.704877 4048 manager.go:1196] Started watching for new ooms in manager Mar 08 03:09:21.706067 master-0 kubenswrapper[4048]: I0308 03:09:21.706033 4048 manager.go:319] Starting recovery of all containers Mar 08 03:09:21.708865 master-0 kubenswrapper[4048]: E0308 
03:09:21.708809 4048 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 08 03:09:21.732159 master-0 kubenswrapper[4048]: I0308 03:09:21.732102 4048 manager.go:324] Recovery completed Mar 08 03:09:21.743219 master-0 kubenswrapper[4048]: I0308 03:09:21.743167 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:21.745040 master-0 kubenswrapper[4048]: I0308 03:09:21.744997 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:21.745040 master-0 kubenswrapper[4048]: I0308 03:09:21.745033 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:21.745040 master-0 kubenswrapper[4048]: I0308 03:09:21.745045 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:21.745886 master-0 kubenswrapper[4048]: I0308 03:09:21.745844 4048 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 03:09:21.745886 master-0 kubenswrapper[4048]: I0308 03:09:21.745875 4048 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 03:09:21.745983 master-0 kubenswrapper[4048]: I0308 03:09:21.745921 4048 state_mem.go:36] "Initialized new in-memory state store" Mar 08 03:09:21.751341 master-0 kubenswrapper[4048]: I0308 03:09:21.751289 4048 policy_none.go:49] "None policy: Start" Mar 08 03:09:21.752780 master-0 kubenswrapper[4048]: I0308 03:09:21.752745 4048 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 03:09:21.752857 master-0 kubenswrapper[4048]: I0308 03:09:21.752800 4048 state_mem.go:35] "Initializing new in-memory state store" Mar 08 03:09:21.801888 master-0 kubenswrapper[4048]: E0308 03:09:21.801674 4048 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Mar 08 03:09:21.836675 master-0 kubenswrapper[4048]: I0308 03:09:21.836591 4048 manager.go:334] "Starting Device Plugin manager" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: I0308 03:09:21.836828 4048 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: I0308 03:09:21.836860 4048 server.go:79] "Starting device plugin registration server" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: I0308 03:09:21.837380 4048 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: I0308 03:09:21.837408 4048 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: I0308 03:09:21.837691 4048 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: I0308 03:09:21.837785 4048 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: I0308 03:09:21.837796 4048 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 08 03:09:21.852160 master-0 kubenswrapper[4048]: E0308 03:09:21.839321 4048 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:09:21.854530 master-0 kubenswrapper[4048]: I0308 03:09:21.854453 4048 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 03:09:21.857920 master-0 kubenswrapper[4048]: I0308 03:09:21.857871 4048 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 08 03:09:21.858052 master-0 kubenswrapper[4048]: I0308 03:09:21.857951 4048 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 03:09:21.858052 master-0 kubenswrapper[4048]: I0308 03:09:21.857979 4048 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 03:09:21.858052 master-0 kubenswrapper[4048]: E0308 03:09:21.858037 4048 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 08 03:09:21.859554 master-0 kubenswrapper[4048]: W0308 03:09:21.859134 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:21.859554 master-0 kubenswrapper[4048]: E0308 03:09:21.859176 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:21.905333 master-0 kubenswrapper[4048]: E0308 03:09:21.905200 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 08 03:09:21.938403 master-0 kubenswrapper[4048]: I0308 03:09:21.938309 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:21.940066 master-0 kubenswrapper[4048]: I0308 03:09:21.940020 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:21.940066 master-0 
kubenswrapper[4048]: I0308 03:09:21.940061 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.940066 master-0 kubenswrapper[4048]: I0308 03:09:21.940071 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.940299 master-0 kubenswrapper[4048]: I0308 03:09:21.940099 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:09:21.941343 master-0 kubenswrapper[4048]: E0308 03:09:21.941258 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:09:21.958514 master-0 kubenswrapper[4048]: I0308 03:09:21.958359 4048 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 08 03:09:21.958514 master-0 kubenswrapper[4048]: I0308 03:09:21.958512 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.959864 master-0 kubenswrapper[4048]: I0308 03:09:21.959812 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.959864 master-0 kubenswrapper[4048]: I0308 03:09:21.959863 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.960018 master-0 kubenswrapper[4048]: I0308 03:09:21.959875 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.960079 master-0 kubenswrapper[4048]: I0308 03:09:21.960037 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.960197 master-0 kubenswrapper[4048]: I0308 03:09:21.960151 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:21.960197 master-0 kubenswrapper[4048]: I0308 03:09:21.960197 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.961103 master-0 kubenswrapper[4048]: I0308 03:09:21.961063 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.961103 master-0 kubenswrapper[4048]: I0308 03:09:21.961096 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.961103 master-0 kubenswrapper[4048]: I0308 03:09:21.961107 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.961323 master-0 kubenswrapper[4048]: I0308 03:09:21.961153 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.961323 master-0 kubenswrapper[4048]: I0308 03:09:21.961172 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.961323 master-0 kubenswrapper[4048]: I0308 03:09:21.961182 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.961323 master-0 kubenswrapper[4048]: I0308 03:09:21.961295 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.961581 master-0 kubenswrapper[4048]: I0308 03:09:21.961437 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:21.961581 master-0 kubenswrapper[4048]: I0308 03:09:21.961533 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.961917 master-0 kubenswrapper[4048]: I0308 03:09:21.961877 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.961917 master-0 kubenswrapper[4048]: I0308 03:09:21.961904 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.961917 master-0 kubenswrapper[4048]: I0308 03:09:21.961914 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.962088 master-0 kubenswrapper[4048]: I0308 03:09:21.962019 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.962147 master-0 kubenswrapper[4048]: I0308 03:09:21.962108 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:21.962147 master-0 kubenswrapper[4048]: I0308 03:09:21.962133 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.962929 master-0 kubenswrapper[4048]: I0308 03:09:21.962871 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.962929 master-0 kubenswrapper[4048]: I0308 03:09:21.962891 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.962929 master-0 kubenswrapper[4048]: I0308 03:09:21.962893 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.962929 master-0 kubenswrapper[4048]: I0308 03:09:21.962926 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.963223 master-0 kubenswrapper[4048]: I0308 03:09:21.962933 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.963223 master-0 kubenswrapper[4048]: I0308 03:09:21.962969 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.963223 master-0 kubenswrapper[4048]: I0308 03:09:21.962987 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.963223 master-0 kubenswrapper[4048]: I0308 03:09:21.963040 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.963223 master-0 kubenswrapper[4048]: I0308 03:09:21.962944 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.963223 master-0 kubenswrapper[4048]: I0308 03:09:21.963084 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.963223 master-0 kubenswrapper[4048]: I0308 03:09:21.963215 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:21.963795 master-0 kubenswrapper[4048]: I0308 03:09:21.963241 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.963795 master-0 kubenswrapper[4048]: I0308 03:09:21.963751 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.963795 master-0 kubenswrapper[4048]: I0308 03:09:21.963774 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.963795 master-0 kubenswrapper[4048]: I0308 03:09:21.963785 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.964056 master-0 kubenswrapper[4048]: I0308 03:09:21.963887 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:21.964056 master-0 kubenswrapper[4048]: I0308 03:09:21.963906 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:21.964056 master-0 kubenswrapper[4048]: I0308 03:09:21.963990 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.964056 master-0 kubenswrapper[4048]: I0308 03:09:21.964027 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.964056 master-0 kubenswrapper[4048]: I0308 03:09:21.964043 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:21.964606 master-0 kubenswrapper[4048]: I0308 03:09:21.964566 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:21.964606 master-0 kubenswrapper[4048]: I0308 03:09:21.964602 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:21.964859 master-0 kubenswrapper[4048]: I0308 03:09:21.964614 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:22.003231 master-0 kubenswrapper[4048]: I0308 03:09:22.003118 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:22.103884 master-0 kubenswrapper[4048]: I0308 03:09:22.103776 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.104146 master-0 kubenswrapper[4048]: I0308 03:09:22.103929 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:22.104146 master-0 kubenswrapper[4048]: I0308 03:09:22.103986 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:22.104146 master-0 kubenswrapper[4048]: I0308 03:09:22.104024 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.104146 master-0 kubenswrapper[4048]: I0308 03:09:22.104063 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.104146 master-0 kubenswrapper[4048]: I0308 03:09:22.104026 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:22.104146 master-0 kubenswrapper[4048]: I0308 03:09:22.104096 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104169 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104247 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104299 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104368 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104390 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104413 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104435 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104455 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104545 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104569 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.104665 master-0 kubenswrapper[4048]: I0308 03:09:22.104609 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.142283 master-0 kubenswrapper[4048]: I0308 03:09:22.142164 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:22.143642 master-0 kubenswrapper[4048]: I0308 03:09:22.143561 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:22.143781 master-0 kubenswrapper[4048]: I0308 03:09:22.143664 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:22.143781 master-0 kubenswrapper[4048]: I0308 03:09:22.143690 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:22.143781 master-0 kubenswrapper[4048]: I0308 03:09:22.143775 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:09:22.145109 master-0 kubenswrapper[4048]: E0308 03:09:22.144996 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.204885 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.204939 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.204956 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.204974 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.204989 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.205003 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.205017 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:22.204987 master-0 kubenswrapper[4048]: I0308 03:09:22.205031 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205081 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205165 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205175 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205217 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205266 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205240 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205272 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205291 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205206 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205219 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205382 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205356 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205559 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205556 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205674 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205725 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.205846 master-0 kubenswrapper[4048]: I0308 03:09:22.205767 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.206865 master-0 kubenswrapper[4048]: I0308 03:09:22.205807 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:22.206865 master-0 kubenswrapper[4048]: I0308 03:09:22.205817 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.206865 master-0 kubenswrapper[4048]: I0308 03:09:22.205878 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.206865 master-0 kubenswrapper[4048]: I0308 03:09:22.205901 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.206865 master-0 kubenswrapper[4048]: I0308 03:09:22.205919 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.206865 master-0 kubenswrapper[4048]: I0308 03:09:22.205949 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.206865 master-0 kubenswrapper[4048]: I0308 03:09:22.205950 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.302375 master-0 kubenswrapper[4048]: I0308 03:09:22.302282 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:09:22.307219 master-0 kubenswrapper[4048]: E0308 03:09:22.307138 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 08 03:09:22.331612 master-0 kubenswrapper[4048]: I0308 03:09:22.331368 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:09:22.342066 master-0 kubenswrapper[4048]: I0308 03:09:22.341971 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:09:22.356646 master-0 kubenswrapper[4048]: I0308 03:09:22.356562 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:22.367669 master-0 kubenswrapper[4048]: I0308 03:09:22.367588 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:09:22.545270 master-0 kubenswrapper[4048]: I0308 03:09:22.545179 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:22.546863 master-0 kubenswrapper[4048]: I0308 03:09:22.546810 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:22.546938 master-0 kubenswrapper[4048]: I0308 03:09:22.546881 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:22.546938 master-0 kubenswrapper[4048]: I0308 03:09:22.546906 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:22.547056 master-0 kubenswrapper[4048]: I0308 03:09:22.546970 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:09:22.548314 master-0 kubenswrapper[4048]: E0308 03:09:22.548250 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:09:22.690114 master-0 kubenswrapper[4048]: I0308 03:09:22.689774 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:09:22.784800 master-0 kubenswrapper[4048]: W0308 03:09:22.784698 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:09:22.784800 master-0 kubenswrapper[4048]: E0308 03:09:22.784778 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:09:22.832668 master-0 kubenswrapper[4048]: W0308 03:09:22.832539 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:09:22.832668 master-0 kubenswrapper[4048]: E0308 03:09:22.832652 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:09:22.955345 master-0 kubenswrapper[4048]: W0308 03:09:22.955238 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9add8df47182fc2eaf8cd78016ebe72.slice/crio-10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8 WatchSource:0}: Error finding container 10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8: Status 404 returned error can't find the container with id 10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8
Mar 08 03:09:22.966336 master-0 kubenswrapper[4048]: I0308 03:09:22.966307 4048 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 03:09:22.985239 master-0 kubenswrapper[4048]: W0308 03:09:22.985176 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f29997baa583b6238f7de9108ee10.slice/crio-d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74 WatchSource:0}: Error finding container d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74: Status 404 returned error can't find the container with id d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74
Mar 08 03:09:23.008973 master-0 kubenswrapper[4048]: W0308 03:09:23.008924 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778 WatchSource:0}: Error finding container 95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778: Status 404 returned error can't find the container with id 95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778
Mar 08 03:09:23.044156 master-0 kubenswrapper[4048]: W0308 03:09:23.044120 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c05e1499b533b83f091333d61f045.slice/crio-2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0 WatchSource:0}: Error finding container 2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0: Status 404 returned error can't find the container with id 2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0
Mar 08 03:09:23.083062 master-0 kubenswrapper[4048]: W0308 03:09:23.083001 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:09:23.083062 master-0 kubenswrapper[4048]: E0308 03:09:23.083064 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:09:23.108372 master-0 kubenswrapper[4048]: E0308 03:09:23.108303 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 08 03:09:23.280094 master-0 kubenswrapper[4048]: W0308 03:09:23.279901 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:09:23.280094 master-0 kubenswrapper[4048]: E0308 03:09:23.280033 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:09:23.349228 master-0 kubenswrapper[4048]: I0308 03:09:23.349143 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:23.350507 master-0 kubenswrapper[4048]: I0308 03:09:23.350405 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:23.350710 master-0 kubenswrapper[4048]: I0308 03:09:23.350513 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0"
event="NodeHasNoDiskPressure" Mar 08 03:09:23.350710 master-0 kubenswrapper[4048]: I0308 03:09:23.350540 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:23.350710 master-0 kubenswrapper[4048]: I0308 03:09:23.350619 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:09:23.352059 master-0 kubenswrapper[4048]: E0308 03:09:23.352003 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:09:23.689245 master-0 kubenswrapper[4048]: I0308 03:09:23.689181 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:23.782628 master-0 kubenswrapper[4048]: I0308 03:09:23.782591 4048 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 03:09:23.783706 master-0 kubenswrapper[4048]: E0308 03:09:23.783659 4048 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:23.865669 master-0 kubenswrapper[4048]: I0308 03:09:23.865563 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0"} Mar 08 03:09:23.866414 
master-0 kubenswrapper[4048]: I0308 03:09:23.866392 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778"} Mar 08 03:09:23.867342 master-0 kubenswrapper[4048]: I0308 03:09:23.867306 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74"} Mar 08 03:09:23.868186 master-0 kubenswrapper[4048]: I0308 03:09:23.868145 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214"} Mar 08 03:09:23.868944 master-0 kubenswrapper[4048]: I0308 03:09:23.868916 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8"} Mar 08 03:09:24.622179 master-0 kubenswrapper[4048]: W0308 03:09:24.622090 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:24.622409 master-0 kubenswrapper[4048]: E0308 03:09:24.622194 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:24.689780 master-0 kubenswrapper[4048]: I0308 03:09:24.689714 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:24.709976 master-0 kubenswrapper[4048]: E0308 03:09:24.709898 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 08 03:09:24.952917 master-0 kubenswrapper[4048]: I0308 03:09:24.952785 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:24.953933 master-0 kubenswrapper[4048]: I0308 03:09:24.953903 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:24.954041 master-0 kubenswrapper[4048]: I0308 03:09:24.953948 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:24.954041 master-0 kubenswrapper[4048]: I0308 03:09:24.953964 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:24.954041 master-0 kubenswrapper[4048]: I0308 03:09:24.954023 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:09:24.954952 master-0 kubenswrapper[4048]: E0308 03:09:24.954907 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 
192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:09:25.018100 master-0 kubenswrapper[4048]: E0308 03:09:25.017966 4048 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abef4bd9b2a85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.686071941 +0000 UTC m=+0.651544542,LastTimestamp:2026-03-08 03:09:21.686071941 +0000 UTC m=+0.651544542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:25.171881 master-0 kubenswrapper[4048]: W0308 03:09:25.171852 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:25.171958 master-0 kubenswrapper[4048]: E0308 03:09:25.171897 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:25.522655 master-0 kubenswrapper[4048]: W0308 03:09:25.522578 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:25.522729 master-0 kubenswrapper[4048]: E0308 03:09:25.522664 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:25.689555 master-0 kubenswrapper[4048]: I0308 03:09:25.689495 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:25.845039 master-0 kubenswrapper[4048]: W0308 03:09:25.844931 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:25.845039 master-0 kubenswrapper[4048]: E0308 03:09:25.844989 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:25.874610 master-0 kubenswrapper[4048]: I0308 03:09:25.874547 4048 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="a319c730809ca26c3a85e3da2618b48d9d17632d0a08b9cfde4f3e18505c5755" exitCode=0 Mar 08 03:09:25.874733 master-0 kubenswrapper[4048]: 
I0308 03:09:25.874642 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"a319c730809ca26c3a85e3da2618b48d9d17632d0a08b9cfde4f3e18505c5755"} Mar 08 03:09:25.874733 master-0 kubenswrapper[4048]: I0308 03:09:25.874687 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:25.875550 master-0 kubenswrapper[4048]: I0308 03:09:25.875513 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:25.875598 master-0 kubenswrapper[4048]: I0308 03:09:25.875569 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:25.875598 master-0 kubenswrapper[4048]: I0308 03:09:25.875586 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:25.878024 master-0 kubenswrapper[4048]: I0308 03:09:25.876561 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1"} Mar 08 03:09:25.878024 master-0 kubenswrapper[4048]: I0308 03:09:25.876580 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044"} Mar 08 03:09:25.878024 master-0 kubenswrapper[4048]: I0308 03:09:25.876622 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:25.878024 master-0 kubenswrapper[4048]: I0308 03:09:25.877373 4048 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:25.878024 master-0 kubenswrapper[4048]: I0308 03:09:25.877399 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:25.878024 master-0 kubenswrapper[4048]: I0308 03:09:25.877412 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:26.690023 master-0 kubenswrapper[4048]: I0308 03:09:26.689977 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:26.880202 master-0 kubenswrapper[4048]: I0308 03:09:26.880159 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 08 03:09:26.880928 master-0 kubenswrapper[4048]: I0308 03:09:26.880658 4048 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="457e72591925a8308611a18637cdfa608e4f1e77bae782ec59c09a030057dc8f" exitCode=1 Mar 08 03:09:26.880928 master-0 kubenswrapper[4048]: I0308 03:09:26.880754 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:26.880928 master-0 kubenswrapper[4048]: I0308 03:09:26.880755 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:26.881226 master-0 kubenswrapper[4048]: I0308 03:09:26.881195 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"457e72591925a8308611a18637cdfa608e4f1e77bae782ec59c09a030057dc8f"} Mar 08 03:09:26.881715 
master-0 kubenswrapper[4048]: I0308 03:09:26.881684 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:26.881715 master-0 kubenswrapper[4048]: I0308 03:09:26.881714 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:26.881797 master-0 kubenswrapper[4048]: I0308 03:09:26.881726 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:26.881797 master-0 kubenswrapper[4048]: I0308 03:09:26.881757 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:26.881797 master-0 kubenswrapper[4048]: I0308 03:09:26.881779 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:26.881797 master-0 kubenswrapper[4048]: I0308 03:09:26.881787 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:26.882009 master-0 kubenswrapper[4048]: I0308 03:09:26.881989 4048 scope.go:117] "RemoveContainer" containerID="457e72591925a8308611a18637cdfa608e4f1e77bae782ec59c09a030057dc8f" Mar 08 03:09:27.690002 master-0 kubenswrapper[4048]: I0308 03:09:27.689919 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:27.884965 master-0 kubenswrapper[4048]: I0308 03:09:27.884899 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 08 03:09:27.885557 master-0 kubenswrapper[4048]: I0308 03:09:27.885513 4048 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 08 03:09:27.885956 master-0 kubenswrapper[4048]: I0308 03:09:27.885921 4048 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="ff1ba2dce4dc35159df67fd790ee0dc1ab08a72be2900e3faf5470e7b67ef338" exitCode=1 Mar 08 03:09:27.885995 master-0 kubenswrapper[4048]: I0308 03:09:27.885953 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"ff1ba2dce4dc35159df67fd790ee0dc1ab08a72be2900e3faf5470e7b67ef338"} Mar 08 03:09:27.886027 master-0 kubenswrapper[4048]: I0308 03:09:27.885997 4048 scope.go:117] "RemoveContainer" containerID="457e72591925a8308611a18637cdfa608e4f1e77bae782ec59c09a030057dc8f" Mar 08 03:09:27.886115 master-0 kubenswrapper[4048]: I0308 03:09:27.886087 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:27.886992 master-0 kubenswrapper[4048]: I0308 03:09:27.886960 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:27.886992 master-0 kubenswrapper[4048]: I0308 03:09:27.886985 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:27.886992 master-0 kubenswrapper[4048]: I0308 03:09:27.886994 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:27.887260 master-0 kubenswrapper[4048]: I0308 03:09:27.887234 4048 scope.go:117] "RemoveContainer" containerID="ff1ba2dce4dc35159df67fd790ee0dc1ab08a72be2900e3faf5470e7b67ef338" Mar 08 03:09:27.887406 master-0 kubenswrapper[4048]: E0308 03:09:27.887376 4048 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:09:27.911402 master-0 kubenswrapper[4048]: E0308 03:09:27.911338 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 08 03:09:28.141789 master-0 kubenswrapper[4048]: I0308 03:09:28.141697 4048 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 03:09:28.143022 master-0 kubenswrapper[4048]: E0308 03:09:28.142969 4048 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:28.155376 master-0 kubenswrapper[4048]: I0308 03:09:28.155332 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:28.156344 master-0 kubenswrapper[4048]: I0308 03:09:28.156313 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:28.156344 master-0 kubenswrapper[4048]: I0308 03:09:28.156341 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:28.156344 master-0 
kubenswrapper[4048]: I0308 03:09:28.156350 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:28.156607 master-0 kubenswrapper[4048]: I0308 03:09:28.156395 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:09:28.157053 master-0 kubenswrapper[4048]: E0308 03:09:28.157000 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:09:28.690012 master-0 kubenswrapper[4048]: I0308 03:09:28.689962 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:28.888416 master-0 kubenswrapper[4048]: I0308 03:09:28.888374 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:28.889091 master-0 kubenswrapper[4048]: I0308 03:09:28.889055 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:28.889131 master-0 kubenswrapper[4048]: I0308 03:09:28.889113 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:28.889159 master-0 kubenswrapper[4048]: I0308 03:09:28.889131 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:28.889666 master-0 kubenswrapper[4048]: I0308 03:09:28.889642 4048 scope.go:117] "RemoveContainer" containerID="ff1ba2dce4dc35159df67fd790ee0dc1ab08a72be2900e3faf5470e7b67ef338" Mar 08 03:09:28.889887 master-0 kubenswrapper[4048]: E0308 03:09:28.889850 4048 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:09:29.172859 master-0 kubenswrapper[4048]: W0308 03:09:29.172788 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:29.173050 master-0 kubenswrapper[4048]: E0308 03:09:29.172872 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:29.353503 master-0 kubenswrapper[4048]: W0308 03:09:29.353387 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:29.353672 master-0 kubenswrapper[4048]: E0308 03:09:29.353530 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:29.689832 master-0 kubenswrapper[4048]: I0308 
03:09:29.689762 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:29.852603 master-0 kubenswrapper[4048]: W0308 03:09:29.852444 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:29.852846 master-0 kubenswrapper[4048]: E0308 03:09:29.852635 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:30.690109 master-0 kubenswrapper[4048]: I0308 03:09:30.690018 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:30.896205 master-0 kubenswrapper[4048]: I0308 03:09:30.895882 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 08 03:09:31.125997 master-0 kubenswrapper[4048]: W0308 03:09:31.125855 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:31.125997 master-0 
kubenswrapper[4048]: E0308 03:09:31.125956 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:09:31.689928 master-0 kubenswrapper[4048]: I0308 03:09:31.689879 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:09:31.839608 master-0 kubenswrapper[4048]: E0308 03:09:31.839564 4048 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:09:31.900539 master-0 kubenswrapper[4048]: I0308 03:09:31.900395 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"0c2763066b9b93da23f0fac4ed741acd53596f68416bb8dfcb0cbbdd5cec3459"} Mar 08 03:09:31.900539 master-0 kubenswrapper[4048]: I0308 03:09:31.900448 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:31.901066 master-0 kubenswrapper[4048]: I0308 03:09:31.901003 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:31.901066 master-0 kubenswrapper[4048]: I0308 03:09:31.901030 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:31.901066 master-0 kubenswrapper[4048]: I0308 03:09:31.901037 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" 
Mar 08 03:09:31.902107 master-0 kubenswrapper[4048]: I0308 03:09:31.901969 4048 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="fce5e0a13a8a0d48e21a3b4ab57b6f9f5f4d96f3d5aa8a45af37601d35ca1619" exitCode=0
Mar 08 03:09:31.902107 master-0 kubenswrapper[4048]: I0308 03:09:31.902021 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:31.902107 master-0 kubenswrapper[4048]: I0308 03:09:31.902048 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"fce5e0a13a8a0d48e21a3b4ab57b6f9f5f4d96f3d5aa8a45af37601d35ca1619"}
Mar 08 03:09:31.902665 master-0 kubenswrapper[4048]: I0308 03:09:31.902636 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:31.902665 master-0 kubenswrapper[4048]: I0308 03:09:31.902670 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:31.902665 master-0 kubenswrapper[4048]: I0308 03:09:31.902682 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:31.904259 master-0 kubenswrapper[4048]: I0308 03:09:31.904215 4048 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42" exitCode=1
Mar 08 03:09:31.904259 master-0 kubenswrapper[4048]: I0308 03:09:31.904240 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42"}
Mar 08 03:09:31.904707 master-0 kubenswrapper[4048]: I0308 03:09:31.904677 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:31.905207 master-0 kubenswrapper[4048]: I0308 03:09:31.905145 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:31.905207 master-0 kubenswrapper[4048]: I0308 03:09:31.905172 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:31.905207 master-0 kubenswrapper[4048]: I0308 03:09:31.905184 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:32.909769 master-0 kubenswrapper[4048]: I0308 03:09:32.909169 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"66dcb2ef9f56c8175e9938f33a7650abc0b5ef0e638ee33a15fd5eee5cc90aba"}
Mar 08 03:09:32.909769 master-0 kubenswrapper[4048]: I0308 03:09:32.909223 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:32.911953 master-0 kubenswrapper[4048]: I0308 03:09:32.910231 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:32.911953 master-0 kubenswrapper[4048]: I0308 03:09:32.911603 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:32.911953 master-0 kubenswrapper[4048]: I0308 03:09:32.911623 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:33.763143 master-0 kubenswrapper[4048]: I0308 03:09:33.761415 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0"
is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:09:33.913821 master-0 kubenswrapper[4048]: I0308 03:09:33.913763 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"5cf51539374bcfe72a242f1e53596d9c98c86b64c9179b7354efb8ce2765e3ca"}
Mar 08 03:09:33.914514 master-0 kubenswrapper[4048]: I0308 03:09:33.913891 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:33.914992 master-0 kubenswrapper[4048]: I0308 03:09:33.914960 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:33.915025 master-0 kubenswrapper[4048]: I0308 03:09:33.914993 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:33.915025 master-0 kubenswrapper[4048]: I0308 03:09:33.915003 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:33.920752 master-0 kubenswrapper[4048]: I0308 03:09:33.920692 4048 scope.go:117] "RemoveContainer" containerID="a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42"
Mar 08 03:09:34.320576 master-0 kubenswrapper[4048]: E0308 03:09:34.319514 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 03:09:34.557981 master-0 kubenswrapper[4048]: I0308 03:09:34.557916 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:34.558906 master-0 kubenswrapper[4048]: I0308 03:09:34.558864 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:34.558906 master-0 kubenswrapper[4048]: I0308 03:09:34.558888 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:34.558906 master-0 kubenswrapper[4048]: I0308 03:09:34.558896 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:34.559199 master-0 kubenswrapper[4048]: I0308 03:09:34.558928 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:09:34.562767 master-0 kubenswrapper[4048]: E0308 03:09:34.562729 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 08 03:09:34.697284 master-0 kubenswrapper[4048]: I0308 03:09:34.697225 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 03:09:34.918281 master-0 kubenswrapper[4048]: I0308 03:09:34.918221 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"9e6b8d3a0f03e9035732338fcc893d4e73b26cab45767d3a7fcf55c614fe104a"}
Mar 08 03:09:34.918880 master-0 kubenswrapper[4048]: I0308 03:09:34.918319 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:09:34.918880 master-0 kubenswrapper[4048]: I0308 03:09:34.918875 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:09:34.918974 master-0
kubenswrapper[4048]: I0308 03:09:34.918893 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:09:34.918974 master-0 kubenswrapper[4048]: I0308 03:09:34.918901 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:09:35.023390 master-0 kubenswrapper[4048]: E0308 03:09:35.023214 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4bd9b2a85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.686071941 +0000 UTC m=+0.651544542,LastTimestamp:2026-03-08 03:09:21.686071941 +0000 UTC m=+0.651544542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.027026 master-0 kubenswrapper[4048]: E0308 03:09:35.026957 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.030897 master-0 kubenswrapper[4048]: E0308 03:09:35.030826 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11ef489 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,LastTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.034678 master-0 kubenswrapper[4048]: E0308 03:09:35.034560 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11f2018 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,LastTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.038411 master-0 kubenswrapper[4048]: E0308 03:09:35.038223 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c6d7096b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.840990571 +0000 UTC m=+0.806463152,LastTimestamp:2026-03-08 03:09:21.840990571 +0000 UTC m=+0.806463152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.045869 master-0 kubenswrapper[4048]: E0308 03:09:35.045754 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.940044303 +0000 UTC m=+0.905516874,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08
03:09:35.050202 master-0 kubenswrapper[4048]: E0308 03:09:35.050117 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11ef489\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11ef489 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,LastTimestamp:2026-03-08 03:09:21.940067925 +0000 UTC m=+0.905540496,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.054890 master-0 kubenswrapper[4048]: E0308 03:09:35.054777 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11f2018\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11f2018 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,LastTimestamp:2026-03-08 03:09:21.940078036 +0000 UTC m=+0.905550617,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.059371 master-0 kubenswrapper[4048]: E0308 03:09:35.059245 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.959847847 +0000 UTC m=+0.925320418,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.063434 master-0 kubenswrapper[4048]: E0308 03:09:35.063351 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11ef489\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11ef489 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,LastTimestamp:2026-03-08 03:09:21.959869819 +0000 UTC m=+0.925342390,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.067191 master-0 kubenswrapper[4048]: E0308 03:09:35.067071 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11f2018\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11f2018 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,LastTimestamp:2026-03-08 03:09:21.95988209 +0000 UTC m=+0.925354671,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.071430 master-0 kubenswrapper[4048]: E0308 03:09:35.071306 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.961083646 +0000 UTC m=+0.926556217,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.075044 master-0 kubenswrapper[4048]: E0308 03:09:35.074969 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11ef489\" is forbidden: User \"system:anonymous\"
cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11ef489 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,LastTimestamp:2026-03-08 03:09:21.961102898 +0000 UTC m=+0.926575479,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.078853 master-0 kubenswrapper[4048]: E0308 03:09:35.078758 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11f2018\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11f2018 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,LastTimestamp:2026-03-08 03:09:21.961114138 +0000 UTC m=+0.926586709,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.083207 master-0 kubenswrapper[4048]: E0308 03:09:35.083123 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.961165312 +0000 UTC m=+0.926637893,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.087058 master-0 kubenswrapper[4048]: E0308 03:09:35.086989 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11ef489\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11ef489 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,LastTimestamp:2026-03-08 03:09:21.961178093 +0000 UTC m=+0.926650664,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.091282 master-0 kubenswrapper[4048]: E0308 03:09:35.091215 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11f2018\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11f2018 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,LastTimestamp:2026-03-08 03:09:21.961187484 +0000 UTC m=+0.926660055,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.096760 master-0 kubenswrapper[4048]: E0308 03:09:35.096698 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.961898525 +0000 UTC m=+0.927371096,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.103750 master-0 kubenswrapper[4048]: E0308 03:09:35.103693 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11ef489\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11ef489 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,LastTimestamp:2026-03-08 03:09:21.961910476 +0000 UTC m=+0.927383047,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.110174 master-0 kubenswrapper[4048]: E0308 03:09:35.110101 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11f2018\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11f2018 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,LastTimestamp:2026-03-08 03:09:21.961920446 +0000 UTC m=+0.927393017,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.115705 master-0 kubenswrapper[4048]: E0308 03:09:35.115648 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.962885656 +0000 UTC m=+0.928358227,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.119978 master-0 kubenswrapper[4048]: E0308 03:09:35.119887 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11ef489\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11ef489 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745040521 +0000 UTC m=+0.710513102,LastTimestamp:2026-03-08 03:09:21.962899127 +0000 UTC m=+0.928371708,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.123494 master-0 kubenswrapper[4048]: E0308 03:09:35.123398 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.962924509 +0000 UTC m=+0.928397110,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.127286 master-0 kubenswrapper[4048]: E0308 03:09:35.127195 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11f2018\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11f2018 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.745051672 +0000 UTC m=+0.710524253,LastTimestamp:2026-03-08 03:09:21.962933899 +0000 UTC m=+0.928406480,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.131021 master-0 kubenswrapper[4048]: E0308 03:09:35.130964 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189abef4c11eb7a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189abef4c11eb7a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:21.74502493 +0000 UTC m=+0.710497511,LastTimestamp:2026-03-08 03:09:21.962958881 +0000 UTC m=+0.928431492,Count:9,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.135119 master-0 kubenswrapper[4048]: E0308 03:09:35.135050 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef509e93b24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:22.96625642 +0000 UTC m=+1.931729011,LastTimestamp:2026-03-08 03:09:22.96625642 +0000 UTC m=+1.931729011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.140306 master-0 kubenswrapper[4048]: E0308 03:09:35.140205 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef50a0465c3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:22.968036803 +0000 UTC m=+1.933509384,LastTimestamp:2026-03-08 03:09:22.968036803 +0000 UTC m=+1.933509384,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.143787 master-0 kubenswrapper[4048]: E0308 03:09:35.143715 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189abef50b5c824b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:22.990588491 +0000 UTC m=+1.956061072,LastTimestamp:2026-03-08 03:09:22.990588491 +0000 UTC m=+1.956061072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.149414 master-0 kubenswrapper[4048]: E0308 03:09:35.149319 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189abef50ca81d83 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:23.012320643 +0000 UTC m=+1.977793224,LastTimestamp:2026-03-08 03:09:23.012320643 +0000 UTC m=+1.977793224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.153743 master-0 kubenswrapper[4048]: E0308 03:09:35.153627 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef50eb1bdd0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:23.046505936 +0000 UTC m=+2.011978507,LastTimestamp:2026-03-08 03:09:23.046505936 +0000 UTC m=+2.011978507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:09:35.156125 master-0 kubenswrapper[4048]: I0308 03:09:35.156035 4048 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:09:35.157786 master-0 kubenswrapper[4048]: E0308 03:09:35.157719 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef58c130fdb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" in 2.181s (2.181s including waiting).
Image size: 465086330 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.150035931 +0000 UTC m=+4.115508502,LastTimestamp:2026-03-08 03:09:25.150035931 +0000 UTC m=+4.115508502,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.159072 master-0 kubenswrapper[4048]: I0308 03:09:35.159033 4048 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:09:35.161281 master-0 kubenswrapper[4048]: E0308 03:09:35.161186 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189abef58df069f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" in 2.19s (2.19s including waiting). 
Image size: 529324693 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.181319667 +0000 UTC m=+4.146792238,LastTimestamp:2026-03-08 03:09:25.181319667 +0000 UTC m=+4.146792238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.165549 master-0 kubenswrapper[4048]: E0308 03:09:35.165474 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef599ef9aa2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.382593186 +0000 UTC m=+4.348065757,LastTimestamp:2026-03-08 03:09:25.382593186 +0000 UTC m=+4.348065757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.172261 master-0 kubenswrapper[4048]: E0308 03:09:35.171781 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189abef599f4d1a3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.382934947 +0000 UTC m=+4.348407518,LastTimestamp:2026-03-08 03:09:25.382934947 +0000 UTC m=+4.348407518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.176305 master-0 kubenswrapper[4048]: E0308 03:09:35.176224 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef59a81cc54 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.392174164 +0000 UTC m=+4.357646735,LastTimestamp:2026-03-08 03:09:25.392174164 +0000 UTC m=+4.357646735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.180234 master-0 kubenswrapper[4048]: E0308 03:09:35.180170 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-master-0-master-0.189abef59afe0a83 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.400316547 +0000 UTC m=+4.365789108,LastTimestamp:2026-03-08 03:09:25.400316547 +0000 UTC m=+4.365789108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.184091 master-0 kubenswrapper[4048]: E0308 03:09:35.183993 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189abef59b2675e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.402965478 +0000 UTC m=+4.368438039,LastTimestamp:2026-03-08 03:09:25.402965478 +0000 UTC m=+4.368438039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.187849 master-0 kubenswrapper[4048]: E0308 03:09:35.187783 4048 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189abef5a62a9137 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.587783991 +0000 UTC m=+4.553256562,LastTimestamp:2026-03-08 03:09:25.587783991 +0000 UTC m=+4.553256562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.191356 master-0 kubenswrapper[4048]: E0308 03:09:35.191282 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189abef5a72faa30 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.60489528 +0000 UTC m=+4.570367851,LastTimestamp:2026-03-08 03:09:25.60489528 +0000 UTC m=+4.570367851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.195008 master-0 kubenswrapper[4048]: E0308 03:09:35.194949 4048 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5b77616c0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.877946048 +0000 UTC m=+4.843418629,LastTimestamp:2026-03-08 03:09:25.877946048 +0000 UTC m=+4.843418629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.198672 master-0 kubenswrapper[4048]: E0308 03:09:35.198603 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5c32223d5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:26.073770965 +0000 UTC m=+5.039243536,LastTimestamp:2026-03-08 03:09:26.073770965 
+0000 UTC m=+5.039243536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.202117 master-0 kubenswrapper[4048]: E0308 03:09:35.202049 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5c46f1881 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:26.095591553 +0000 UTC m=+5.061064124,LastTimestamp:2026-03-08 03:09:26.095591553 +0000 UTC m=+5.061064124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.206096 master-0 kubenswrapper[4048]: E0308 03:09:35.205961 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef5b77616c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5b77616c0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.877946048 +0000 UTC m=+4.843418629,LastTimestamp:2026-03-08 03:09:26.888068514 +0000 UTC m=+5.853541085,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.209452 master-0 kubenswrapper[4048]: E0308 03:09:35.209394 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef5c32223d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5c32223d5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:26.073770965 +0000 UTC m=+5.039243536,LastTimestamp:2026-03-08 03:09:27.083360279 +0000 UTC m=+6.048832920,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.212908 master-0 kubenswrapper[4048]: E0308 03:09:35.212815 4048 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef5c46f1881\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5c46f1881 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:26.095591553 +0000 UTC m=+5.061064124,LastTimestamp:2026-03-08 03:09:27.096039362 +0000 UTC m=+6.061511933,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.217000 master-0 kubenswrapper[4048]: E0308 03:09:35.216884 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef62f3b22d9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod 
kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:27.887348441 +0000 UTC m=+6.852821012,LastTimestamp:2026-03-08 03:09:27.887348441 +0000 UTC m=+6.852821012,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.220498 master-0 kubenswrapper[4048]: E0308 03:09:35.220427 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef62f3b22d9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef62f3b22d9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:27.887348441 +0000 UTC m=+6.852821012,LastTimestamp:2026-03-08 03:09:28.889818886 +0000 UTC m=+7.855291487,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.225474 master-0 kubenswrapper[4048]: E0308 03:09:35.225400 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef6d6aeed12 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.73s (7.73s including waiting). Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.696731922 +0000 UTC m=+9.662204543,LastTimestamp:2026-03-08 03:09:30.696731922 +0000 UTC m=+9.662204543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.228958 master-0 kubenswrapper[4048]: E0308 03:09:35.228874 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189abef6d892c52a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.716s (7.716s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.72844113 +0000 UTC m=+9.693913691,LastTimestamp:2026-03-08 03:09:30.72844113 +0000 UTC m=+9.693913691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.234338 master-0 kubenswrapper[4048]: E0308 03:09:35.234193 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef6d9ca836f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.702s (7.702s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.748871535 +0000 UTC m=+9.714344166,LastTimestamp:2026-03-08 03:09:30.748871535 +0000 UTC m=+9.714344166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.238081 master-0 kubenswrapper[4048]: E0308 03:09:35.238021 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef6e5c0898d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.949544333 +0000 UTC m=+9.915016934,LastTimestamp:2026-03-08 03:09:30.949544333 +0000 UTC m=+9.915016934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.241636 master-0 kubenswrapper[4048]: E0308 03:09:35.241569 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef6e6e8c4cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.968958159 +0000 UTC m=+9.934430740,LastTimestamp:2026-03-08 03:09:30.968958159 +0000 UTC m=+9.934430740,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.244593 master-0 kubenswrapper[4048]: E0308 03:09:35.244516 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189abef6e6f5ab17 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.969803543 +0000 UTC m=+9.935276124,LastTimestamp:2026-03-08 03:09:30.969803543 +0000 UTC m=+9.935276124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.247607 master-0 kubenswrapper[4048]: E0308 03:09:35.247551 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef6e6f7fd7c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.969955708 +0000 UTC m=+9.935428309,LastTimestamp:2026-03-08 03:09:30.969955708 +0000 UTC m=+9.935428309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.250529 master-0 kubenswrapper[4048]: E0308 03:09:35.250464 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef6e7731fdf kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.978025439 +0000 UTC m=+9.943498020,LastTimestamp:2026-03-08 03:09:30.978025439 +0000 UTC m=+9.943498020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.254461 master-0 kubenswrapper[4048]: E0308 03:09:35.254400 4048 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef6e783edc0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.97912672 +0000 UTC m=+9.944599331,LastTimestamp:2026-03-08 03:09:30.97912672 +0000 UTC m=+9.944599331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.258410 master-0 kubenswrapper[4048]: E0308 03:09:35.258357 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189abef6e7a258c6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.981120198 +0000 UTC m=+9.946592779,LastTimestamp:2026-03-08 03:09:30.981120198 +0000 UTC m=+9.946592779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.262250 master-0 kubenswrapper[4048]: E0308 03:09:35.262196 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef71eae1469 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:31.904636009 +0000 UTC m=+10.870108580,LastTimestamp:2026-03-08 03:09:31.904636009 +0000 UTC m=+10.870108580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.265685 master-0 kubenswrapper[4048]: E0308 03:09:35.265624 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef72b241f57 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created 
container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:32.113698647 +0000 UTC m=+11.079171258,LastTimestamp:2026-03-08 03:09:32.113698647 +0000 UTC m=+11.079171258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.269010 master-0 kubenswrapper[4048]: E0308 03:09:35.268944 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef72bfddd86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:32.127968646 +0000 UTC m=+11.093441247,LastTimestamp:2026-03-08 03:09:32.127968646 +0000 UTC m=+11.093441247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.274055 master-0 kubenswrapper[4048]: E0308 03:09:35.273958 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef72c119bf7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:32.129262583 +0000 UTC m=+11.094735194,LastTimestamp:2026-03-08 03:09:32.129262583 +0000 UTC m=+11.094735194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.277515 master-0 kubenswrapper[4048]: E0308 03:09:35.277425 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef764d8814d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\" in 2.102s (2.102s including waiting). 
Image size: 505242594 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:33.081821517 +0000 UTC m=+12.047294088,LastTimestamp:2026-03-08 03:09:33.081821517 +0000 UTC m=+12.047294088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.282367 master-0 kubenswrapper[4048]: E0308 03:09:35.282311 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef770156842 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:33.270362178 +0000 UTC m=+12.235834749,LastTimestamp:2026-03-08 03:09:33.270362178 +0000 UTC m=+12.235834749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.285443 master-0 kubenswrapper[4048]: E0308 03:09:35.285371 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef770a9153f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:33.280040255 +0000 UTC m=+12.245512826,LastTimestamp:2026-03-08 03:09:33.280040255 +0000 UTC m=+12.245512826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.289930 master-0 kubenswrapper[4048]: E0308 03:09:35.289865 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef7971d2a48 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:33.925182024 +0000 UTC m=+12.890654595,LastTimestamp:2026-03-08 03:09:33.925182024 +0000 UTC m=+12.890654595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.293744 master-0 kubenswrapper[4048]: E0308 03:09:35.293676 4048 event.go:359] "Server rejected event (will not retry!)" err="events 
\"bootstrap-kube-controller-manager-master-0.189abef6e6f7fd7c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef6e6f7fd7c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.969955708 +0000 UTC m=+9.935428309,LastTimestamp:2026-03-08 03:09:34.266228253 +0000 UTC m=+13.231700824,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.297640 master-0 kubenswrapper[4048]: E0308 03:09:35.297573 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189abef6e7731fdf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189abef6e7731fdf kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:30.978025439 +0000 UTC m=+9.943498020,LastTimestamp:2026-03-08 03:09:34.332174972 +0000 UTC m=+13.297647553,Count:2,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.383532 master-0 kubenswrapper[4048]: I0308 03:09:35.383467 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:09:35.546640 master-0 kubenswrapper[4048]: E0308 03:09:35.546457 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef7f77b9211 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" in 3.412s (3.412s including waiting). 
Image size: 514980169 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:35.541981713 +0000 UTC m=+14.507454294,LastTimestamp:2026-03-08 03:09:35.541981713 +0000 UTC m=+14.507454294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.696040 master-0 kubenswrapper[4048]: I0308 03:09:35.695974 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:35.754039 master-0 kubenswrapper[4048]: E0308 03:09:35.753835 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef803b29859 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:35.746914393 +0000 UTC m=+14.712386994,LastTimestamp:2026-03-08 03:09:35.746914393 +0000 UTC m=+14.712386994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.767477 master-0 kubenswrapper[4048]: E0308 03:09:35.767305 4048 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189abef804795c30 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:35.759940656 +0000 UTC m=+14.725413267,LastTimestamp:2026-03-08 03:09:35.759940656 +0000 UTC m=+14.725413267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:35.923323 master-0 kubenswrapper[4048]: I0308 03:09:35.923256 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"4408f61b9048ed833e9161e86cec42c8c15221795d207fe82e8f7a4527778dfb"} Mar 08 03:09:35.923323 master-0 kubenswrapper[4048]: I0308 03:09:35.923295 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:35.923323 master-0 kubenswrapper[4048]: I0308 03:09:35.923282 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:35.924409 master-0 kubenswrapper[4048]: I0308 03:09:35.924321 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:35.924409 master-0 kubenswrapper[4048]: I0308 03:09:35.924344 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:35.924409 
master-0 kubenswrapper[4048]: I0308 03:09:35.924372 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:35.924409 master-0 kubenswrapper[4048]: I0308 03:09:35.924389 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:35.924409 master-0 kubenswrapper[4048]: I0308 03:09:35.924353 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:35.924704 master-0 kubenswrapper[4048]: I0308 03:09:35.924468 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:36.700309 master-0 kubenswrapper[4048]: I0308 03:09:36.700244 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:36.835839 master-0 kubenswrapper[4048]: I0308 03:09:36.835728 4048 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 03:09:36.856965 master-0 kubenswrapper[4048]: I0308 03:09:36.856917 4048 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 03:09:36.925384 master-0 kubenswrapper[4048]: I0308 03:09:36.925321 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:36.925384 master-0 kubenswrapper[4048]: I0308 03:09:36.925393 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:36.926457 master-0 kubenswrapper[4048]: I0308 03:09:36.926375 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:36.926567 
master-0 kubenswrapper[4048]: I0308 03:09:36.926510 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:36.926567 master-0 kubenswrapper[4048]: I0308 03:09:36.926528 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:36.926879 master-0 kubenswrapper[4048]: I0308 03:09:36.926838 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:36.926879 master-0 kubenswrapper[4048]: I0308 03:09:36.926869 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:36.926879 master-0 kubenswrapper[4048]: I0308 03:09:36.926878 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:37.668590 master-0 kubenswrapper[4048]: W0308 03:09:37.668522 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 03:09:37.669052 master-0 kubenswrapper[4048]: E0308 03:09:37.668596 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 03:09:37.697043 master-0 kubenswrapper[4048]: I0308 03:09:37.696967 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:38.697514 master-0 kubenswrapper[4048]: I0308 03:09:38.697431 4048 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:38.819299 master-0 kubenswrapper[4048]: I0308 03:09:38.819204 4048 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:09:38.820010 master-0 kubenswrapper[4048]: I0308 03:09:38.819957 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:38.821424 master-0 kubenswrapper[4048]: I0308 03:09:38.821363 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:38.821424 master-0 kubenswrapper[4048]: I0308 03:09:38.821425 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:38.821615 master-0 kubenswrapper[4048]: I0308 03:09:38.821442 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:38.826143 master-0 kubenswrapper[4048]: I0308 03:09:38.826097 4048 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:09:38.930360 master-0 kubenswrapper[4048]: I0308 03:09:38.930271 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:38.930789 master-0 kubenswrapper[4048]: I0308 03:09:38.930464 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:09:38.931645 master-0 kubenswrapper[4048]: I0308 03:09:38.931592 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:38.931744 
master-0 kubenswrapper[4048]: I0308 03:09:38.931655 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:38.931744 master-0 kubenswrapper[4048]: I0308 03:09:38.931674 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:38.937237 master-0 kubenswrapper[4048]: I0308 03:09:38.937177 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:09:39.684226 master-0 kubenswrapper[4048]: W0308 03:09:39.647151 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:39.684226 master-0 kubenswrapper[4048]: E0308 03:09:39.647221 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 03:09:39.697325 master-0 kubenswrapper[4048]: I0308 03:09:39.697261 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:39.932813 master-0 kubenswrapper[4048]: I0308 03:09:39.932732 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:39.934129 master-0 kubenswrapper[4048]: I0308 03:09:39.934075 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 
03:09:39.934218 master-0 kubenswrapper[4048]: I0308 03:09:39.934143 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:39.934218 master-0 kubenswrapper[4048]: I0308 03:09:39.934161 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:40.499804 master-0 kubenswrapper[4048]: W0308 03:09:40.499719 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 08 03:09:40.499804 master-0 kubenswrapper[4048]: E0308 03:09:40.499772 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 03:09:40.588422 master-0 kubenswrapper[4048]: I0308 03:09:40.588324 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:09:40.588699 master-0 kubenswrapper[4048]: I0308 03:09:40.588661 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:40.590847 master-0 kubenswrapper[4048]: I0308 03:09:40.590796 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:40.590933 master-0 kubenswrapper[4048]: I0308 03:09:40.590858 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:40.590933 master-0 kubenswrapper[4048]: I0308 03:09:40.590883 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:40.703787 
master-0 kubenswrapper[4048]: I0308 03:09:40.703568 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:40.934925 master-0 kubenswrapper[4048]: I0308 03:09:40.934871 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:40.936068 master-0 kubenswrapper[4048]: I0308 03:09:40.936023 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:40.936144 master-0 kubenswrapper[4048]: I0308 03:09:40.936070 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:40.936144 master-0 kubenswrapper[4048]: I0308 03:09:40.936089 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:41.328605 master-0 kubenswrapper[4048]: E0308 03:09:41.328443 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 03:09:41.563952 master-0 kubenswrapper[4048]: I0308 03:09:41.563834 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:41.565243 master-0 kubenswrapper[4048]: I0308 03:09:41.565194 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:41.565338 master-0 kubenswrapper[4048]: I0308 03:09:41.565257 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:41.565338 master-0 kubenswrapper[4048]: 
I0308 03:09:41.565270 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:41.565338 master-0 kubenswrapper[4048]: I0308 03:09:41.565328 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:09:41.572888 master-0 kubenswrapper[4048]: E0308 03:09:41.572841 4048 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 08 03:09:41.697362 master-0 kubenswrapper[4048]: I0308 03:09:41.697307 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:41.839835 master-0 kubenswrapper[4048]: E0308 03:09:41.839743 4048 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:09:41.905213 master-0 kubenswrapper[4048]: I0308 03:09:41.900154 4048 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:09:41.905213 master-0 kubenswrapper[4048]: I0308 03:09:41.900379 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:41.905213 master-0 kubenswrapper[4048]: I0308 03:09:41.902210 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:41.905213 master-0 kubenswrapper[4048]: I0308 03:09:41.902257 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:41.905213 master-0 kubenswrapper[4048]: I0308 03:09:41.902281 4048 kubelet_node_status.go:724] "Recording 
event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:41.911286 master-0 kubenswrapper[4048]: I0308 03:09:41.909080 4048 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:09:41.913278 master-0 kubenswrapper[4048]: W0308 03:09:41.913212 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 08 03:09:41.913278 master-0 kubenswrapper[4048]: E0308 03:09:41.913273 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 03:09:41.937779 master-0 kubenswrapper[4048]: I0308 03:09:41.937709 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:41.938959 master-0 kubenswrapper[4048]: I0308 03:09:41.938891 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:41.939054 master-0 kubenswrapper[4048]: I0308 03:09:41.938969 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:41.939054 master-0 kubenswrapper[4048]: I0308 03:09:41.938994 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:41.970844 master-0 kubenswrapper[4048]: I0308 03:09:41.970744 4048 csr.go:261] certificate signing request csr-sjvrj is approved, waiting to be issued Mar 08 03:09:42.696344 master-0 kubenswrapper[4048]: I0308 03:09:42.696268 
4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:43.695553 master-0 kubenswrapper[4048]: I0308 03:09:43.695467 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:43.858919 master-0 kubenswrapper[4048]: I0308 03:09:43.858840 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:43.860526 master-0 kubenswrapper[4048]: I0308 03:09:43.860432 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:43.860526 master-0 kubenswrapper[4048]: I0308 03:09:43.860526 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:43.860702 master-0 kubenswrapper[4048]: I0308 03:09:43.860555 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:43.861097 master-0 kubenswrapper[4048]: I0308 03:09:43.861051 4048 scope.go:117] "RemoveContainer" containerID="ff1ba2dce4dc35159df67fd790ee0dc1ab08a72be2900e3faf5470e7b67ef338" Mar 08 03:09:43.875113 master-0 kubenswrapper[4048]: E0308 03:09:43.874761 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef5b77616c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5b77616c0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:25.877946048 +0000 UTC m=+4.843418629,LastTimestamp:2026-03-08 03:09:43.865143144 +0000 UTC m=+22.830615745,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:44.157509 master-0 kubenswrapper[4048]: E0308 03:09:44.157300 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef5c32223d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5c32223d5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:26.073770965 +0000 UTC m=+5.039243536,LastTimestamp:2026-03-08 03:09:44.148890411 +0000 UTC m=+23.114363012,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:44.175019 master-0 kubenswrapper[4048]: E0308 03:09:44.174824 
4048 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef5c46f1881\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef5c46f1881 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:26.095591553 +0000 UTC m=+5.061064124,LastTimestamp:2026-03-08 03:09:44.167307689 +0000 UTC m=+23.132780300,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:44.696470 master-0 kubenswrapper[4048]: I0308 03:09:44.696342 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:44.949108 master-0 kubenswrapper[4048]: I0308 03:09:44.948928 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:09:44.949781 master-0 kubenswrapper[4048]: I0308 03:09:44.949730 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 08 03:09:44.950821 master-0 kubenswrapper[4048]: I0308 03:09:44.950750 
4048 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a" exitCode=1 Mar 08 03:09:44.950955 master-0 kubenswrapper[4048]: I0308 03:09:44.950818 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a"} Mar 08 03:09:44.950955 master-0 kubenswrapper[4048]: I0308 03:09:44.950880 4048 scope.go:117] "RemoveContainer" containerID="ff1ba2dce4dc35159df67fd790ee0dc1ab08a72be2900e3faf5470e7b67ef338" Mar 08 03:09:44.951120 master-0 kubenswrapper[4048]: I0308 03:09:44.951005 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:44.952181 master-0 kubenswrapper[4048]: I0308 03:09:44.952136 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:44.952318 master-0 kubenswrapper[4048]: I0308 03:09:44.952187 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:44.952318 master-0 kubenswrapper[4048]: I0308 03:09:44.952210 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:44.953025 master-0 kubenswrapper[4048]: I0308 03:09:44.952967 4048 scope.go:117] "RemoveContainer" containerID="dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a" Mar 08 03:09:44.954249 master-0 kubenswrapper[4048]: E0308 03:09:44.953266 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio 
pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:09:44.961548 master-0 kubenswrapper[4048]: E0308 03:09:44.961318 4048 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189abef62f3b22d9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189abef62f3b22d9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:09:27.887348441 +0000 UTC m=+6.852821012,LastTimestamp:2026-03-08 03:09:44.953179129 +0000 UTC m=+23.918651740,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:09:45.388740 master-0 kubenswrapper[4048]: I0308 03:09:45.388642 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:09:45.388976 master-0 kubenswrapper[4048]: I0308 03:09:45.388818 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:45.390056 master-0 kubenswrapper[4048]: I0308 03:09:45.389971 4048 kubelet_node_status.go:724] "Recording 
event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:45.390056 master-0 kubenswrapper[4048]: I0308 03:09:45.390037 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:45.390056 master-0 kubenswrapper[4048]: I0308 03:09:45.390058 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:45.395543 master-0 kubenswrapper[4048]: I0308 03:09:45.395456 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:09:45.695309 master-0 kubenswrapper[4048]: I0308 03:09:45.695164 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:45.955790 master-0 kubenswrapper[4048]: I0308 03:09:45.955629 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:09:45.956542 master-0 kubenswrapper[4048]: I0308 03:09:45.956474 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:45.959345 master-0 kubenswrapper[4048]: I0308 03:09:45.959275 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:45.959345 master-0 kubenswrapper[4048]: I0308 03:09:45.959332 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:45.959345 master-0 kubenswrapper[4048]: I0308 03:09:45.959349 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 
03:09:46.694976 master-0 kubenswrapper[4048]: I0308 03:09:46.694881 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:47.696818 master-0 kubenswrapper[4048]: I0308 03:09:47.696742 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:48.335545 master-0 kubenswrapper[4048]: E0308 03:09:48.335428 4048 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 03:09:48.573870 master-0 kubenswrapper[4048]: I0308 03:09:48.573753 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:48.575287 master-0 kubenswrapper[4048]: I0308 03:09:48.575202 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:48.575287 master-0 kubenswrapper[4048]: I0308 03:09:48.575250 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:48.575287 master-0 kubenswrapper[4048]: I0308 03:09:48.575264 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:48.575696 master-0 kubenswrapper[4048]: I0308 03:09:48.575322 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:09:48.582997 master-0 kubenswrapper[4048]: E0308 03:09:48.582925 4048 kubelet_node_status.go:99] "Unable to 
register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 08 03:09:48.695870 master-0 kubenswrapper[4048]: I0308 03:09:48.695790 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:49.696821 master-0 kubenswrapper[4048]: I0308 03:09:49.696722 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:50.696979 master-0 kubenswrapper[4048]: I0308 03:09:50.696918 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:51.696795 master-0 kubenswrapper[4048]: I0308 03:09:51.696706 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:51.840286 master-0 kubenswrapper[4048]: E0308 03:09:51.840183 4048 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:09:52.193254 master-0 kubenswrapper[4048]: W0308 03:09:52.193183 4048 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 03:09:52.193254 master-0 
kubenswrapper[4048]: E0308 03:09:52.193254 4048 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 03:09:52.696354 master-0 kubenswrapper[4048]: I0308 03:09:52.696240 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:53.698051 master-0 kubenswrapper[4048]: I0308 03:09:53.697925 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:54.699202 master-0 kubenswrapper[4048]: I0308 03:09:54.698836 4048 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 03:09:55.251764 master-0 kubenswrapper[4048]: I0308 03:09:55.251693 4048 csr.go:257] certificate signing request csr-sjvrj is issued Mar 08 03:09:55.348842 master-0 kubenswrapper[4048]: E0308 03:09:55.348783 4048 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Mar 08 03:09:55.569327 master-0 kubenswrapper[4048]: I0308 03:09:55.569125 4048 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 08 03:09:55.583593 master-0 kubenswrapper[4048]: I0308 03:09:55.583525 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 08 03:09:55.585032 master-0 kubenswrapper[4048]: I0308 03:09:55.584982 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:55.585032 master-0 kubenswrapper[4048]: I0308 03:09:55.585030 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:55.585032 master-0 kubenswrapper[4048]: I0308 03:09:55.585042 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:55.585284 master-0 kubenswrapper[4048]: I0308 03:09:55.585096 4048 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:09:55.603280 master-0 kubenswrapper[4048]: I0308 03:09:55.603238 4048 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 08 03:09:55.603621 master-0 kubenswrapper[4048]: E0308 03:09:55.603596 4048 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 08 03:09:55.615753 master-0 kubenswrapper[4048]: E0308 03:09:55.615718 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:55.716439 master-0 kubenswrapper[4048]: E0308 03:09:55.716377 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:55.726789 master-0 kubenswrapper[4048]: I0308 03:09:55.726739 4048 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 08 03:09:55.749514 master-0 kubenswrapper[4048]: I0308 03:09:55.749454 4048 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 03:09:55.817206 master-0 kubenswrapper[4048]: E0308 03:09:55.817154 4048 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"master-0\" not found" Mar 08 03:09:55.917884 master-0 kubenswrapper[4048]: E0308 03:09:55.917835 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.018767 master-0 kubenswrapper[4048]: E0308 03:09:56.018735 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.119913 master-0 kubenswrapper[4048]: E0308 03:09:56.119873 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.220676 master-0 kubenswrapper[4048]: E0308 03:09:56.220509 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.253827 master-0 kubenswrapper[4048]: I0308 03:09:56.253762 4048 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 03:01:34 +0000 UTC, rotation deadline is 2026-03-08 23:13:19.475727176 +0000 UTC Mar 08 03:09:56.254069 master-0 kubenswrapper[4048]: I0308 03:09:56.254045 4048 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h3m23.221690564s for next certificate rotation Mar 08 03:09:56.321584 master-0 kubenswrapper[4048]: E0308 03:09:56.321533 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.422560 master-0 kubenswrapper[4048]: E0308 03:09:56.422447 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.523394 master-0 kubenswrapper[4048]: E0308 03:09:56.523271 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.624438 master-0 kubenswrapper[4048]: E0308 03:09:56.624400 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Mar 08 03:09:56.725633 master-0 kubenswrapper[4048]: E0308 03:09:56.725550 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.826187 master-0 kubenswrapper[4048]: E0308 03:09:56.826050 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:56.858774 master-0 kubenswrapper[4048]: I0308 03:09:56.858736 4048 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:09:56.860242 master-0 kubenswrapper[4048]: I0308 03:09:56.860190 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:09:56.860365 master-0 kubenswrapper[4048]: I0308 03:09:56.860250 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:09:56.860365 master-0 kubenswrapper[4048]: I0308 03:09:56.860266 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:09:56.860825 master-0 kubenswrapper[4048]: I0308 03:09:56.860793 4048 scope.go:117] "RemoveContainer" containerID="dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a" Mar 08 03:09:56.861043 master-0 kubenswrapper[4048]: E0308 03:09:56.861003 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 08 03:09:56.926769 master-0 kubenswrapper[4048]: E0308 03:09:56.926714 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Mar 08 03:09:57.027889 master-0 kubenswrapper[4048]: E0308 03:09:57.027786 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.129029 master-0 kubenswrapper[4048]: E0308 03:09:57.128970 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.229173 master-0 kubenswrapper[4048]: E0308 03:09:57.229106 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.330007 master-0 kubenswrapper[4048]: E0308 03:09:57.329915 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.431035 master-0 kubenswrapper[4048]: E0308 03:09:57.430846 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.531749 master-0 kubenswrapper[4048]: E0308 03:09:57.531666 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.632640 master-0 kubenswrapper[4048]: E0308 03:09:57.632519 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.733087 master-0 kubenswrapper[4048]: E0308 03:09:57.732880 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.833766 master-0 kubenswrapper[4048]: E0308 03:09:57.833663 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:57.933898 master-0 kubenswrapper[4048]: E0308 03:09:57.933821 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:58.034969 master-0 kubenswrapper[4048]: E0308 03:09:58.034828 4048 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:58.135854 master-0 kubenswrapper[4048]: E0308 03:09:58.135778 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:58.236866 master-0 kubenswrapper[4048]: E0308 03:09:58.236788 4048 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:09:58.254082 master-0 kubenswrapper[4048]: I0308 03:09:58.254032 4048 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 03:09:58.693708 master-0 kubenswrapper[4048]: I0308 03:09:58.693643 4048 apiserver.go:52] "Watching apiserver" Mar 08 03:09:58.699146 master-0 kubenswrapper[4048]: I0308 03:09:58.699103 4048 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 03:09:58.699305 master-0 kubenswrapper[4048]: I0308 03:09:58.699259 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq","openshift-network-operator/network-operator-7c649bf6d4-98n6d"] Mar 08 03:09:58.699759 master-0 kubenswrapper[4048]: I0308 03:09:58.699718 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.700184 master-0 kubenswrapper[4048]: I0308 03:09:58.699736 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.702071 master-0 kubenswrapper[4048]: I0308 03:09:58.702037 4048 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 03:09:58.703829 master-0 kubenswrapper[4048]: I0308 03:09:58.703792 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 03:09:58.704025 master-0 kubenswrapper[4048]: I0308 03:09:58.703800 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:09:58.704122 master-0 kubenswrapper[4048]: I0308 03:09:58.703854 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:09:58.704122 master-0 kubenswrapper[4048]: I0308 03:09:58.704098 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:09:58.704674 master-0 kubenswrapper[4048]: I0308 03:09:58.704642 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 03:09:58.705282 master-0 kubenswrapper[4048]: I0308 03:09:58.705231 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 03:09:58.791968 master-0 kubenswrapper[4048]: I0308 03:09:58.791881 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.791968 master-0 kubenswrapper[4048]: I0308 03:09:58.791945 4048 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.791968 master-0 kubenswrapper[4048]: I0308 03:09:58.791978 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8pfx\" (UniqueName: \"kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.792910 master-0 kubenswrapper[4048]: I0308 03:09:58.792014 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.792910 master-0 kubenswrapper[4048]: I0308 03:09:58.792049 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.792910 master-0 kubenswrapper[4048]: I0308 03:09:58.792083 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.792910 master-0 kubenswrapper[4048]: I0308 03:09:58.792113 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.792910 master-0 kubenswrapper[4048]: I0308 03:09:58.792149 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.892836 master-0 kubenswrapper[4048]: I0308 03:09:58.892727 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.893175 master-0 kubenswrapper[4048]: I0308 03:09:58.892799 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " 
pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.893175 master-0 kubenswrapper[4048]: I0308 03:09:58.892990 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.893354 master-0 kubenswrapper[4048]: I0308 03:09:58.893092 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pfx\" (UniqueName: \"kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.893652 master-0 kubenswrapper[4048]: E0308 03:09:58.893552 4048 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:09:58.893652 master-0 kubenswrapper[4048]: I0308 03:09:58.893610 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.893891 master-0 kubenswrapper[4048]: E0308 03:09:58.893798 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:09:59.393651799 +0000 UTC m=+38.359124410 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:09:58.894222 master-0 kubenswrapper[4048]: I0308 03:09:58.894039 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.894397 master-0 kubenswrapper[4048]: I0308 03:09:58.894149 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.894397 master-0 kubenswrapper[4048]: I0308 03:09:58.894261 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.895530 master-0 kubenswrapper[4048]: I0308 03:09:58.894661 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: 
\"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.895957 master-0 kubenswrapper[4048]: I0308 03:09:58.895914 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.896223 master-0 kubenswrapper[4048]: I0308 03:09:58.896158 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.896420 master-0 kubenswrapper[4048]: I0308 03:09:58.896261 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.897244 master-0 kubenswrapper[4048]: I0308 03:09:58.897195 4048 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 03:09:58.906690 master-0 kubenswrapper[4048]: I0308 03:09:58.906528 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:58.922396 master-0 kubenswrapper[4048]: I0308 03:09:58.922298 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:58.925392 master-0 kubenswrapper[4048]: I0308 03:09:58.925334 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pfx\" (UniqueName: \"kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:59.034078 master-0 kubenswrapper[4048]: I0308 03:09:59.033884 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:09:59.055748 master-0 kubenswrapper[4048]: W0308 03:09:59.055669 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982ea338_c7be_4776_9bb7_113834c54aaa.slice/crio-e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223 WatchSource:0}: Error finding container e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223: Status 404 returned error can't find the container with id e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223 Mar 08 03:09:59.332049 master-0 kubenswrapper[4048]: I0308 03:09:59.331785 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-9g2h9"] Mar 08 03:09:59.332525 master-0 kubenswrapper[4048]: I0308 03:09:59.332162 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.334783 master-0 kubenswrapper[4048]: I0308 03:09:59.334727 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Mar 08 03:09:59.336546 master-0 kubenswrapper[4048]: I0308 03:09:59.336510 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Mar 08 03:09:59.336924 master-0 kubenswrapper[4048]: I0308 03:09:59.336863 4048 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Mar 08 03:09:59.337110 master-0 kubenswrapper[4048]: I0308 03:09:59.337051 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Mar 08 03:09:59.400599 master-0 kubenswrapper[4048]: I0308 03:09:59.400448 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:09:59.400854 master-0 kubenswrapper[4048]: E0308 03:09:59.400706 4048 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:09:59.400854 master-0 kubenswrapper[4048]: E0308 03:09:59.400835 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:10:00.400806585 +0000 UTC m=+39.366279196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:09:59.500873 master-0 kubenswrapper[4048]: I0308 03:09:59.500786 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-var-run-resolv-conf\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.500873 master-0 kubenswrapper[4048]: I0308 03:09:59.500853 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-sno-bootstrap-files\") pod 
\"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.501121 master-0 kubenswrapper[4048]: I0308 03:09:59.500897 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-resolv-conf\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.501121 master-0 kubenswrapper[4048]: I0308 03:09:59.500951 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-ca-bundle\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.501121 master-0 kubenswrapper[4048]: I0308 03:09:59.500984 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhz4v\" (UniqueName: \"kubernetes.io/projected/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-kube-api-access-vhz4v\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.601892 master-0 kubenswrapper[4048]: I0308 03:09:59.601788 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-resolv-conf\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.601892 master-0 kubenswrapper[4048]: I0308 
03:09:59.601881 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-ca-bundle\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.602278 master-0 kubenswrapper[4048]: I0308 03:09:59.602017 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-resolv-conf\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.602278 master-0 kubenswrapper[4048]: I0308 03:09:59.602125 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhz4v\" (UniqueName: \"kubernetes.io/projected/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-kube-api-access-vhz4v\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.602406 master-0 kubenswrapper[4048]: I0308 03:09:59.602297 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-var-run-resolv-conf\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.602406 master-0 kubenswrapper[4048]: I0308 03:09:59.602343 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-sno-bootstrap-files\") pod \"assisted-installer-controller-9g2h9\" (UID: 
\"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.602585 master-0 kubenswrapper[4048]: I0308 03:09:59.602243 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-ca-bundle\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.602585 master-0 kubenswrapper[4048]: I0308 03:09:59.602432 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-sno-bootstrap-files\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.602585 master-0 kubenswrapper[4048]: I0308 03:09:59.602433 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-var-run-resolv-conf\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.632250 master-0 kubenswrapper[4048]: I0308 03:09:59.632190 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhz4v\" (UniqueName: \"kubernetes.io/projected/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-kube-api-access-vhz4v\") pod \"assisted-installer-controller-9g2h9\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.672765 master-0 kubenswrapper[4048]: I0308 03:09:59.672710 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:09:59.688388 master-0 kubenswrapper[4048]: W0308 03:09:59.688315 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5abe9d1_62c2_4d7e_9b77_403ea0cfbbf5.slice/crio-e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f WatchSource:0}: Error finding container e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f: Status 404 returned error can't find the container with id e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f Mar 08 03:09:59.753449 master-0 kubenswrapper[4048]: I0308 03:09:59.753383 4048 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 03:09:59.996017 master-0 kubenswrapper[4048]: I0308 03:09:59.995849 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-9g2h9" event={"ID":"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5","Type":"ContainerStarted","Data":"e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f"} Mar 08 03:09:59.997229 master-0 kubenswrapper[4048]: I0308 03:09:59.997179 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" event={"ID":"982ea338-c7be-4776-9bb7-113834c54aaa","Type":"ContainerStarted","Data":"e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223"} Mar 08 03:10:00.408425 master-0 kubenswrapper[4048]: I0308 03:10:00.408339 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:10:00.408787 master-0 kubenswrapper[4048]: 
E0308 03:10:00.408606 4048 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:00.408787 master-0 kubenswrapper[4048]: E0308 03:10:00.408732 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:10:02.408704104 +0000 UTC m=+41.374176715 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:02.420967 master-0 kubenswrapper[4048]: I0308 03:10:02.420892 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:10:02.422571 master-0 kubenswrapper[4048]: E0308 03:10:02.421085 4048 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:02.422571 master-0 kubenswrapper[4048]: E0308 03:10:02.421168 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:10:06.421149437 +0000 UTC m=+45.386622008 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: E0308 03:10:02.644865 4048 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,Command:[/bin/bash -c #!/bin/bash Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: set -o allexport Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: source /etc/kubernetes/apiserver-url.env Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: else Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: exit 1 Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: fi Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9242604e78efada5aeb232d73a7963f806b754213f5d92b1dffc9b493d7b5a65,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b19b9d0e5437b0bb19cafc3fb516f654c911cdf11184c0de9a27b43c6b80c9ce,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,}
,EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3aa7c84e73a2a19cc9baca38b7e86dfcde579aa88221647c332c83f047d5ae6d,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bfe4d3125d98cc501d5a529d3ae2497106a2bbb5a6dd06df7c0e0930d168212,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b62afe74fdcb011a4a8c8fa5572dbab2514dda673ae4be4c6beaef92d28216ba,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceF
ieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8pfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7c649bf6d4-98n6d_openshift-network-operator(982ea338-c7be-4776-9bb7-113834c54aaa): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 03:10:02.644917 master-0 kubenswrapper[4048]: > logger="UnhandledError" Mar 08 03:10:02.646766 master-0 kubenswrapper[4048]: E0308 03:10:02.646155 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" podUID="982ea338-c7be-4776-9bb7-113834c54aaa" Mar 08 03:10:02.705032 master-0 kubenswrapper[4048]: I0308 03:10:02.704289 4048 csr.go:261] certificate signing request csr-rrqwn is approved, waiting to be issued Mar 08 03:10:02.711853 master-0 
kubenswrapper[4048]: I0308 03:10:02.711795 4048 csr.go:257] certificate signing request csr-rrqwn is issued Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: E0308 03:10:03.013746 4048 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,Command:[/bin/bash -c #!/bin/bash Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: set -o allexport Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: source /etc/kubernetes/apiserver-url.env Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: else Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: exit 1 Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: fi Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9242604e78efada5aeb232d73a7963f806b754213f5d92b1dffc9b493d7b5a65,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b19b9d0e5437b0bb19cafc3fb516f654c911cdf11184c0de9a27b43c6b80c9ce,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,}
,EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3aa7c84e73a2a19cc9baca38b7e86dfcde579aa88221647c332c83f047d5ae6d,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bfe4d3125d98cc501d5a529d3ae2497106a2bbb5a6dd06df7c0e0930d168212,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b62afe74fdcb011a4a8c8fa5572dbab2514dda673ae4be4c6beaef92d28216ba,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceF
ieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8pfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7c649bf6d4-98n6d_openshift-network-operator(982ea338-c7be-4776-9bb7-113834c54aaa): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 03:10:03.013863 master-0 kubenswrapper[4048]: > logger="UnhandledError" Mar 08 03:10:03.015735 master-0 kubenswrapper[4048]: E0308 03:10:03.015093 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" podUID="982ea338-c7be-4776-9bb7-113834c54aaa" Mar 08 03:10:03.713822 master-0 kubenswrapper[4048]: I0308 03:10:03.713766 4048 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:01:34 +0000 UTC, 
rotation deadline is 2026-03-08 21:06:41.601311289 +0000 UTC Mar 08 03:10:03.713822 master-0 kubenswrapper[4048]: I0308 03:10:03.713800 4048 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h56m37.887513847s for next certificate rotation Mar 08 03:10:04.609137 master-0 kubenswrapper[4048]: E0308 03:10:04.609008 4048 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,}
,},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhz4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPrese
nt,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-9g2h9_assisted-installer(a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 03:10:04.610357 master-0 kubenswrapper[4048]: E0308 03:10:04.610290 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-9g2h9" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" Mar 08 03:10:04.714073 master-0 kubenswrapper[4048]: I0308 03:10:04.713970 4048 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:01:34 +0000 UTC, rotation deadline is 2026-03-08 21:10:08.064517566 +0000 UTC Mar 08 03:10:04.714073 master-0 kubenswrapper[4048]: I0308 03:10:04.714007 4048 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h0m3.350513165s for next certificate rotation Mar 08 03:10:05.016878 master-0 kubenswrapper[4048]: E0308 03:10:05.016723 4048 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRe
f:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhz4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID 
SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-9g2h9_assisted-installer(a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 03:10:05.019048 master-0 kubenswrapper[4048]: E0308 03:10:05.018969 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-9g2h9" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" Mar 08 03:10:06.485742 master-0 kubenswrapper[4048]: I0308 03:10:06.485636 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:10:06.486803 master-0 kubenswrapper[4048]: E0308 03:10:06.485819 4048 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:06.486803 master-0 kubenswrapper[4048]: E0308 03:10:06.485930 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert 
podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:10:14.485902094 +0000 UTC m=+53.451374705 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:06.746768 master-0 kubenswrapper[4048]: I0308 03:10:06.746559 4048 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 03:10:09.875510 master-0 kubenswrapper[4048]: I0308 03:10:09.875388 4048 scope.go:117] "RemoveContainer" containerID="dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a" Mar 08 03:10:09.876304 master-0 kubenswrapper[4048]: I0308 03:10:09.875996 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 08 03:10:11.029517 master-0 kubenswrapper[4048]: I0308 03:10:11.029189 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:10:11.030586 master-0 kubenswrapper[4048]: I0308 03:10:11.030261 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"6f51a4db85d18d82603b426a557c9c6da1c85541f85af4f912c744b7f3a66c18"} Mar 08 03:10:11.050763 master-0 kubenswrapper[4048]: I0308 03:10:11.050635 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=2.050612272 podStartE2EDuration="2.050612272s" podCreationTimestamp="2026-03-08 03:10:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:10:11.049781188 +0000 UTC m=+50.015253799" watchObservedRunningTime="2026-03-08 03:10:11.050612272 +0000 UTC m=+50.016084873" Mar 08 03:10:14.537439 master-0 kubenswrapper[4048]: I0308 03:10:14.537328 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:10:14.538476 master-0 kubenswrapper[4048]: E0308 03:10:14.537533 4048 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:14.538476 master-0 kubenswrapper[4048]: E0308 03:10:14.537631 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:10:30.53760356 +0000 UTC m=+69.503076171 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:16.866377 master-0 kubenswrapper[4048]: E0308 03:10:16.866204 4048 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verifica
tion,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhz4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/te
rmination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-9g2h9_assisted-installer(a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: E0308 03:10:16.866662 4048 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,Command:[/bin/bash -c #!/bin/bash Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: set -o allexport Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: source /etc/kubernetes/apiserver-url.env Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: else Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: exit 1 Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: fi Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9242604e78efada5aeb232d73a7963f806b754213f5d92b1dffc9b493d7b5a65,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b19b9d0e5437b0bb19cafc3fb516f654c911cdf11184c0de9a27b43c6b80c9ce,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,}
,EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3aa7c84e73a2a19cc9baca38b7e86dfcde579aa88221647c332c83f047d5ae6d,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bfe4d3125d98cc501d5a529d3ae2497106a2bbb5a6dd06df7c0e0930d168212,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b62afe74fdcb011a4a8c8fa5572dbab2514dda673ae4be4c6beaef92d28216ba,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceF
ieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8pfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7c649bf6d4-98n6d_openshift-network-operator(982ea338-c7be-4776-9bb7-113834c54aaa): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 03:10:16.867176 master-0 kubenswrapper[4048]: > logger="UnhandledError" Mar 08 03:10:16.867892 master-0 kubenswrapper[4048]: E0308 03:10:16.867563 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-9g2h9" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" Mar 08 03:10:16.868709 master-0 kubenswrapper[4048]: E0308 03:10:16.868655 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" 
with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" podUID="982ea338-c7be-4776-9bb7-113834c54aaa" Mar 08 03:10:20.765002 master-0 kubenswrapper[4048]: I0308 03:10:20.764903 4048 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 03:10:28.862963 master-0 kubenswrapper[4048]: I0308 03:10:28.862869 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Mar 08 03:10:28.872777 master-0 kubenswrapper[4048]: I0308 03:10:28.872729 4048 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Mar 08 03:10:29.075322 master-0 kubenswrapper[4048]: I0308 03:10:29.075280 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-9g2h9" event={"ID":"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5","Type":"ContainerStarted","Data":"00d06648e335af10bd876c293f9902417ade2722b0f152f68b636aa5a6ef0592"} Mar 08 03:10:29.078249 master-0 kubenswrapper[4048]: I0308 03:10:29.078222 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" event={"ID":"982ea338-c7be-4776-9bb7-113834c54aaa","Type":"ContainerStarted","Data":"5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56"} Mar 08 03:10:29.095502 master-0 kubenswrapper[4048]: I0308 03:10:29.095418 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="assisted-installer/assisted-installer-controller-9g2h9" podStartSLOduration=299.178435575 podStartE2EDuration="5m4.095401972s" podCreationTimestamp="2026-03-08 03:05:25 +0000 UTC" firstStartedPulling="2026-03-08 03:09:59.691728977 +0000 UTC m=+38.657201588" lastFinishedPulling="2026-03-08 03:10:04.608695374 +0000 UTC m=+43.574167985" 
observedRunningTime="2026-03-08 03:10:29.094384493 +0000 UTC m=+68.059857104" watchObservedRunningTime="2026-03-08 03:10:29.095401972 +0000 UTC m=+68.060874543" Mar 08 03:10:30.083451 master-0 kubenswrapper[4048]: I0308 03:10:30.083356 4048 generic.go:334] "Generic (PLEG): container finished" podID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerID="00d06648e335af10bd876c293f9902417ade2722b0f152f68b636aa5a6ef0592" exitCode=0 Mar 08 03:10:30.083451 master-0 kubenswrapper[4048]: I0308 03:10:30.083404 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-9g2h9" event={"ID":"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5","Type":"ContainerDied","Data":"00d06648e335af10bd876c293f9902417ade2722b0f152f68b636aa5a6ef0592"} Mar 08 03:10:30.105233 master-0 kubenswrapper[4048]: I0308 03:10:30.105114 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" podStartSLOduration=31.520470789 podStartE2EDuration="35.105078462s" podCreationTimestamp="2026-03-08 03:09:55 +0000 UTC" firstStartedPulling="2026-03-08 03:09:59.058956104 +0000 UTC m=+38.024428715" lastFinishedPulling="2026-03-08 03:10:02.643563817 +0000 UTC m=+41.609036388" observedRunningTime="2026-03-08 03:10:29.112898413 +0000 UTC m=+68.078371004" watchObservedRunningTime="2026-03-08 03:10:30.105078462 +0000 UTC m=+69.070551073" Mar 08 03:10:30.553300 master-0 kubenswrapper[4048]: I0308 03:10:30.553187 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:10:30.553608 master-0 kubenswrapper[4048]: E0308 03:10:30.553455 4048 secret.go:189] Couldn't get secret 
openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:30.553689 master-0 kubenswrapper[4048]: E0308 03:10:30.553611 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:02.553557708 +0000 UTC m=+101.519030319 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:10:30.906702 master-0 kubenswrapper[4048]: I0308 03:10:30.906631 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-lshxs"] Mar 08 03:10:30.907046 master-0 kubenswrapper[4048]: I0308 03:10:30.906987 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-lshxs" Mar 08 03:10:31.057899 master-0 kubenswrapper[4048]: I0308 03:10:31.057789 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2x7c\" (UniqueName: \"kubernetes.io/projected/1bbf59e4-f202-4f7c-9f45-0d07de8e6447-kube-api-access-j2x7c\") pod \"mtu-prober-lshxs\" (UID: \"1bbf59e4-f202-4f7c-9f45-0d07de8e6447\") " pod="openshift-network-operator/mtu-prober-lshxs" Mar 08 03:10:31.116842 master-0 kubenswrapper[4048]: I0308 03:10:31.116805 4048 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:10:31.158223 master-0 kubenswrapper[4048]: I0308 03:10:31.158018 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-var-run-resolv-conf\") pod \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " Mar 08 03:10:31.158223 master-0 kubenswrapper[4048]: I0308 03:10:31.158087 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-resolv-conf\") pod \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " Mar 08 03:10:31.158223 master-0 kubenswrapper[4048]: I0308 03:10:31.158127 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhz4v\" (UniqueName: \"kubernetes.io/projected/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-kube-api-access-vhz4v\") pod \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " Mar 08 03:10:31.158223 master-0 kubenswrapper[4048]: I0308 03:10:31.158160 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-sno-bootstrap-files\") pod \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " Mar 08 03:10:31.158223 master-0 kubenswrapper[4048]: I0308 03:10:31.158189 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-ca-bundle\") pod \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\" (UID: \"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5\") " Mar 08 03:10:31.158223 master-0 
kubenswrapper[4048]: I0308 03:10:31.158209 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" (UID: "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:10:31.160043 master-0 kubenswrapper[4048]: I0308 03:10:31.158274 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2x7c\" (UniqueName: \"kubernetes.io/projected/1bbf59e4-f202-4f7c-9f45-0d07de8e6447-kube-api-access-j2x7c\") pod \"mtu-prober-lshxs\" (UID: \"1bbf59e4-f202-4f7c-9f45-0d07de8e6447\") " pod="openshift-network-operator/mtu-prober-lshxs" Mar 08 03:10:31.160043 master-0 kubenswrapper[4048]: I0308 03:10:31.158300 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" (UID: "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:10:31.160043 master-0 kubenswrapper[4048]: I0308 03:10:31.158298 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" (UID: "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5"). InnerVolumeSpecName "sno-bootstrap-files". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:10:31.160043 master-0 kubenswrapper[4048]: I0308 03:10:31.158340 4048 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 08 03:10:31.160043 master-0 kubenswrapper[4048]: I0308 03:10:31.158296 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" (UID: "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:10:31.166113 master-0 kubenswrapper[4048]: I0308 03:10:31.166018 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-kube-api-access-vhz4v" (OuterVolumeSpecName: "kube-api-access-vhz4v") pod "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" (UID: "a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5"). InnerVolumeSpecName "kube-api-access-vhz4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:10:31.179921 master-0 kubenswrapper[4048]: I0308 03:10:31.179868 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2x7c\" (UniqueName: \"kubernetes.io/projected/1bbf59e4-f202-4f7c-9f45-0d07de8e6447-kube-api-access-j2x7c\") pod \"mtu-prober-lshxs\" (UID: \"1bbf59e4-f202-4f7c-9f45-0d07de8e6447\") " pod="openshift-network-operator/mtu-prober-lshxs" Mar 08 03:10:31.228591 master-0 kubenswrapper[4048]: I0308 03:10:31.228446 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-lshxs" Mar 08 03:10:31.246132 master-0 kubenswrapper[4048]: W0308 03:10:31.246077 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bbf59e4_f202_4f7c_9f45_0d07de8e6447.slice/crio-08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a WatchSource:0}: Error finding container 08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a: Status 404 returned error can't find the container with id 08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a Mar 08 03:10:31.258693 master-0 kubenswrapper[4048]: I0308 03:10:31.258643 4048 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 08 03:10:31.258849 master-0 kubenswrapper[4048]: I0308 03:10:31.258699 4048 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhz4v\" (UniqueName: \"kubernetes.io/projected/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-kube-api-access-vhz4v\") on node \"master-0\" DevicePath \"\"" Mar 08 03:10:31.258849 master-0 kubenswrapper[4048]: I0308 03:10:31.258733 4048 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 08 03:10:31.258849 master-0 kubenswrapper[4048]: I0308 03:10:31.258758 4048 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:10:32.091980 master-0 kubenswrapper[4048]: I0308 03:10:32.091897 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-9g2h9" 
event={"ID":"a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5","Type":"ContainerDied","Data":"e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f"} Mar 08 03:10:32.092383 master-0 kubenswrapper[4048]: I0308 03:10:32.092346 4048 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f" Mar 08 03:10:32.092699 master-0 kubenswrapper[4048]: I0308 03:10:32.092669 4048 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:10:32.094885 master-0 kubenswrapper[4048]: I0308 03:10:32.094818 4048 generic.go:334] "Generic (PLEG): container finished" podID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerID="6f47524787fe6d12f2f00918cc138535f7c801d780aa325200500bc9264d2c6c" exitCode=0 Mar 08 03:10:32.094885 master-0 kubenswrapper[4048]: I0308 03:10:32.094877 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-lshxs" event={"ID":"1bbf59e4-f202-4f7c-9f45-0d07de8e6447","Type":"ContainerDied","Data":"6f47524787fe6d12f2f00918cc138535f7c801d780aa325200500bc9264d2c6c"} Mar 08 03:10:32.095109 master-0 kubenswrapper[4048]: I0308 03:10:32.094917 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-lshxs" event={"ID":"1bbf59e4-f202-4f7c-9f45-0d07de8e6447","Type":"ContainerStarted","Data":"08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a"} Mar 08 03:10:33.111774 master-0 kubenswrapper[4048]: I0308 03:10:33.111713 4048 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-lshxs" Mar 08 03:10:33.172235 master-0 kubenswrapper[4048]: I0308 03:10:33.172140 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2x7c\" (UniqueName: \"kubernetes.io/projected/1bbf59e4-f202-4f7c-9f45-0d07de8e6447-kube-api-access-j2x7c\") pod \"1bbf59e4-f202-4f7c-9f45-0d07de8e6447\" (UID: \"1bbf59e4-f202-4f7c-9f45-0d07de8e6447\") " Mar 08 03:10:33.177365 master-0 kubenswrapper[4048]: I0308 03:10:33.177287 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbf59e4-f202-4f7c-9f45-0d07de8e6447-kube-api-access-j2x7c" (OuterVolumeSpecName: "kube-api-access-j2x7c") pod "1bbf59e4-f202-4f7c-9f45-0d07de8e6447" (UID: "1bbf59e4-f202-4f7c-9f45-0d07de8e6447"). InnerVolumeSpecName "kube-api-access-j2x7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:10:33.272696 master-0 kubenswrapper[4048]: I0308 03:10:33.272623 4048 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2x7c\" (UniqueName: \"kubernetes.io/projected/1bbf59e4-f202-4f7c-9f45-0d07de8e6447-kube-api-access-j2x7c\") on node \"master-0\" DevicePath \"\"" Mar 08 03:10:34.100227 master-0 kubenswrapper[4048]: I0308 03:10:34.099905 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-lshxs" event={"ID":"1bbf59e4-f202-4f7c-9f45-0d07de8e6447","Type":"ContainerDied","Data":"08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a"} Mar 08 03:10:34.100227 master-0 kubenswrapper[4048]: I0308 03:10:34.100172 4048 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a" Mar 08 03:10:34.100227 master-0 kubenswrapper[4048]: I0308 03:10:34.099987 4048 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-lshxs" Mar 08 03:10:35.873290 master-0 kubenswrapper[4048]: I0308 03:10:35.873213 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 03:10:35.931544 master-0 kubenswrapper[4048]: I0308 03:10:35.931429 4048 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-lshxs"] Mar 08 03:10:35.935557 master-0 kubenswrapper[4048]: I0308 03:10:35.935468 4048 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-lshxs"] Mar 08 03:10:37.865648 master-0 kubenswrapper[4048]: I0308 03:10:37.865541 4048 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" path="/var/lib/kubelet/pods/1bbf59e4-f202-4f7c-9f45-0d07de8e6447/volumes" Mar 08 03:10:41.024464 master-0 kubenswrapper[4048]: I0308 03:10:41.024384 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hfnwm"] Mar 08 03:10:41.025635 master-0 kubenswrapper[4048]: E0308 03:10:41.024530 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller" Mar 08 03:10:41.025635 master-0 kubenswrapper[4048]: I0308 03:10:41.024551 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller" Mar 08 03:10:41.025635 master-0 kubenswrapper[4048]: E0308 03:10:41.024566 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober" Mar 08 03:10:41.025635 master-0 kubenswrapper[4048]: I0308 03:10:41.024580 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober" Mar 08 03:10:41.025635 master-0 kubenswrapper[4048]: I0308 03:10:41.024620 4048 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller" Mar 08 03:10:41.025635 master-0 kubenswrapper[4048]: I0308 03:10:41.024636 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober" Mar 08 03:10:41.025635 master-0 kubenswrapper[4048]: I0308 03:10:41.024912 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.027868 master-0 kubenswrapper[4048]: I0308 03:10:41.027834 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 03:10:41.028803 master-0 kubenswrapper[4048]: I0308 03:10:41.028753 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 03:10:41.029057 master-0 kubenswrapper[4048]: I0308 03:10:41.028873 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 03:10:41.038652 master-0 kubenswrapper[4048]: I0308 03:10:41.038590 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 03:10:41.056591 master-0 kubenswrapper[4048]: I0308 03:10:41.056531 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5qjn5"] Mar 08 03:10:41.057322 master-0 kubenswrapper[4048]: I0308 03:10:41.057282 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.059200 master-0 kubenswrapper[4048]: I0308 03:10:41.059153 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 03:10:41.059730 master-0 kubenswrapper[4048]: I0308 03:10:41.059669 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 03:10:41.073918 master-0 kubenswrapper[4048]: I0308 03:10:41.073650 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=6.073625733 podStartE2EDuration="6.073625733s" podCreationTimestamp="2026-03-08 03:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:10:41.073422957 +0000 UTC m=+80.038895538" watchObservedRunningTime="2026-03-08 03:10:41.073625733 +0000 UTC m=+80.039098344" Mar 08 03:10:41.128293 master-0 kubenswrapper[4048]: I0308 03:10:41.128241 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.128293 master-0 kubenswrapper[4048]: I0308 03:10:41.128291 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.128659 master-0 kubenswrapper[4048]: I0308 03:10:41.128316 4048 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.128659 master-0 kubenswrapper[4048]: I0308 03:10:41.128436 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.128659 master-0 kubenswrapper[4048]: I0308 03:10:41.128575 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.128659 master-0 kubenswrapper[4048]: I0308 03:10:41.128612 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.128659 master-0 kubenswrapper[4048]: I0308 03:10:41.128643 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " 
pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.128958 master-0 kubenswrapper[4048]: I0308 03:10:41.128684 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.128958 master-0 kubenswrapper[4048]: I0308 03:10:41.128720 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.128958 master-0 kubenswrapper[4048]: I0308 03:10:41.128747 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.128958 master-0 kubenswrapper[4048]: I0308 03:10:41.128777 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.128958 master-0 kubenswrapper[4048]: I0308 03:10:41.128814 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: 
\"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.128946 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.129011 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.129055 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.129086 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.129140 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.129171 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.129201 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129231 master-0 kubenswrapper[4048]: I0308 03:10:41.129233 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129747 master-0 kubenswrapper[4048]: I0308 03:10:41.129332 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5c5z\" (UniqueName: \"kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129747 master-0 kubenswrapper[4048]: I0308 03:10:41.129365 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129747 master-0 kubenswrapper[4048]: I0308 03:10:41.129393 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.129747 master-0 kubenswrapper[4048]: I0308 03:10:41.129422 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:10:41.230735 master-0 kubenswrapper[4048]: I0308 03:10:41.230607 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.230735 master-0 kubenswrapper[4048]: I0308 03:10:41.230737 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:10:41.231081 master-0 kubenswrapper[4048]: I0308 03:10:41.230765 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231081 master-0 kubenswrapper[4048]: I0308 03:10:41.230861 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231081 master-0 kubenswrapper[4048]: I0308 03:10:41.230957 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231081 master-0 kubenswrapper[4048]: I0308 03:10:41.231006 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231081 master-0 kubenswrapper[4048]: I0308 03:10:41.231057 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231366 master-0 kubenswrapper[4048]: I0308 03:10:41.231113 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231366 master-0 kubenswrapper[4048]: I0308 03:10:41.231154 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231366 master-0 kubenswrapper[4048]: I0308 03:10:41.231173 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.231366 master-0 kubenswrapper[4048]: I0308 03:10:41.231205 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231366 master-0 kubenswrapper[4048]: I0308 03:10:41.231226 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231366 master-0 kubenswrapper[4048]: I0308 03:10:41.231323 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231754 master-0 kubenswrapper[4048]: I0308 03:10:41.231613 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c5z\" (UniqueName: \"kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231754 master-0 kubenswrapper[4048]: I0308 03:10:41.231666 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231754 master-0 kubenswrapper[4048]: I0308 03:10:41.231699 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231754 master-0 kubenswrapper[4048]: I0308 03:10:41.231727 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231974 master-0 kubenswrapper[4048]: I0308 03:10:41.231758 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.231974 master-0 kubenswrapper[4048]: I0308 03:10:41.231792 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.231974 master-0 kubenswrapper[4048]: I0308 03:10:41.231820 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.231974 master-0 kubenswrapper[4048]: I0308 03:10:41.231857 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.231974 master-0 kubenswrapper[4048]: I0308 03:10:41.231894 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8qqr\" (UniqueName: \"kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.231974 master-0 kubenswrapper[4048]: I0308 03:10:41.231932 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.231974 master-0 kubenswrapper[4048]: I0308 03:10:41.231963 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.231992 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.232021 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.232056 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.232084 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.232116 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.232146 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.232177 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.232347 master-0 kubenswrapper[4048]: I0308 03:10:41.232255 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.232820 master-0 kubenswrapper[4048]: I0308 03:10:41.232681 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.232820 master-0 kubenswrapper[4048]: I0308 03:10:41.232713 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.232820 master-0 kubenswrapper[4048]: I0308 03:10:41.232712 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.232820 master-0 kubenswrapper[4048]: I0308 03:10:41.232802 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.233040 master-0 kubenswrapper[4048]: I0308 03:10:41.232865 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.233040 master-0 kubenswrapper[4048]: I0308 03:10:41.232919 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.233040 master-0 kubenswrapper[4048]: I0308 03:10:41.232930 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.233391 master-0 kubenswrapper[4048]: I0308 03:10:41.233123 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.233391 master-0 kubenswrapper[4048]: I0308 03:10:41.233272 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.233845 master-0 kubenswrapper[4048]: I0308 03:10:41.233753 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.233944 master-0 kubenswrapper[4048]: I0308 03:10:41.233887 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.234006 master-0 kubenswrapper[4048]: I0308 03:10:41.233947 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.234068 master-0 kubenswrapper[4048]: I0308 03:10:41.233969 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.234510 master-0 kubenswrapper[4048]: I0308 03:10:41.234450 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.234792 master-0 kubenswrapper[4048]: I0308 03:10:41.234747 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.234944 master-0 kubenswrapper[4048]: I0308 03:10:41.234903 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.262416 master-0 kubenswrapper[4048]: I0308 03:10:41.262344 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c5z\" (UniqueName: \"kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.358352 master-0 kubenswrapper[4048]: I0308 03:10:41.358265 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qqr\" (UniqueName: \"kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.359012 master-0 kubenswrapper[4048]: I0308 03:10:41.358956 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hfnwm"
Mar 08 03:10:41.378119 master-0 kubenswrapper[4048]: W0308 03:10:41.378057 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9949f9f4_00f3_4ac8_b8a2_a9549693f5b1.slice/crio-0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79 WatchSource:0}: Error finding container 0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79: Status 404 returned error can't find the container with id 0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79
Mar 08 03:10:41.387357 master-0 kubenswrapper[4048]: I0308 03:10:41.387183 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qqr\" (UniqueName: \"kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.672997 master-0 kubenswrapper[4048]: I0308 03:10:41.672862 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:10:41.690030 master-0 kubenswrapper[4048]: W0308 03:10:41.689963 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ceb013_e999_4f15_bf25_f8dcd2647f9f.slice/crio-c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929 WatchSource:0}: Error finding container c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929: Status 404 returned error can't find the container with id c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929
Mar 08 03:10:41.783936 master-0 kubenswrapper[4048]: I0308 03:10:41.783835 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jl9tj"]
Mar 08 03:10:41.784671 master-0 kubenswrapper[4048]: I0308 03:10:41.784631 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:41.784824 master-0 kubenswrapper[4048]: E0308 03:10:41.784756 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:10:41.863735 master-0 kubenswrapper[4048]: I0308 03:10:41.863602 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:41.863735 master-0 kubenswrapper[4048]: I0308 03:10:41.863718 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7hzl\" (UniqueName: \"kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:41.872218 master-0 kubenswrapper[4048]: W0308 03:10:41.872162 4048 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 08 03:10:41.872427 master-0 kubenswrapper[4048]: I0308 03:10:41.872347 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 08 03:10:41.964452 master-0 kubenswrapper[4048]: I0308 03:10:41.964282 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:41.964452 master-0 kubenswrapper[4048]: I0308 03:10:41.964397 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7hzl\" (UniqueName: \"kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:41.964856 master-0 kubenswrapper[4048]: E0308 03:10:41.964790 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:41.964954 master-0 kubenswrapper[4048]: E0308 03:10:41.964901 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:10:42.464875316 +0000 UTC m=+81.430347897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:41.998647 master-0 kubenswrapper[4048]: I0308 03:10:41.993203 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7hzl\" (UniqueName: \"kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:42.120350 master-0 kubenswrapper[4048]: I0308 03:10:42.120255 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerStarted","Data":"c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929"}
Mar 08 03:10:42.122701 master-0 kubenswrapper[4048]: I0308 03:10:42.122653 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hfnwm" event={"ID":"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1","Type":"ContainerStarted","Data":"0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79"}
Mar 08 03:10:42.469187 master-0 kubenswrapper[4048]: I0308 03:10:42.469111 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:42.469462 master-0 kubenswrapper[4048]: E0308 03:10:42.469245 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:42.469462 master-0 kubenswrapper[4048]: E0308 03:10:42.469297 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:10:43.469280004 +0000 UTC m=+82.434752595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:43.478375 master-0 kubenswrapper[4048]: I0308 03:10:43.478334 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:43.479661 master-0 kubenswrapper[4048]: E0308 03:10:43.478426 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:43.479661 master-0 kubenswrapper[4048]: E0308 03:10:43.478471 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:10:45.478458389 +0000 UTC m=+84.443930960 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:43.858587 master-0 kubenswrapper[4048]: I0308 03:10:43.858546 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:43.858879 master-0 kubenswrapper[4048]: E0308 03:10:43.858850 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:10:43.871324 master-0 kubenswrapper[4048]: I0308 03:10:43.871265 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 08 03:10:45.131454 master-0 kubenswrapper[4048]: I0308 03:10:45.131366 4048 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="4aa5fc291dd0b6e7ec288140975372ce39389e86edf66a268784556c20872aa9" exitCode=0
Mar 08 03:10:45.131454 master-0 kubenswrapper[4048]: I0308 03:10:45.131406 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerDied","Data":"4aa5fc291dd0b6e7ec288140975372ce39389e86edf66a268784556c20872aa9"}
Mar 08 03:10:45.148561 master-0 kubenswrapper[4048]: I0308 03:10:45.147719 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=2.147690075 podStartE2EDuration="2.147690075s" podCreationTimestamp="2026-03-08 03:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:10:45.146959424 +0000 UTC m=+84.112432015" watchObservedRunningTime="2026-03-08 03:10:45.147690075 +0000 UTC m=+84.113162676"
Mar 08 03:10:45.160043 master-0 kubenswrapper[4048]: I0308 03:10:45.159890 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=4.159860464 podStartE2EDuration="4.159860464s" podCreationTimestamp="2026-03-08 03:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:10:45.159135053 +0000 UTC m=+84.124607664" watchObservedRunningTime="2026-03-08 03:10:45.159860464 +0000 UTC m=+84.125333075"
Mar 08 03:10:45.494040 master-0 kubenswrapper[4048]: I0308 03:10:45.493875 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:45.494040 master-0 kubenswrapper[4048]: E0308 03:10:45.494005 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:45.494375 master-0 kubenswrapper[4048]: E0308 03:10:45.494059 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:10:49.494042354 +0000 UTC m=+88.459514935 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:45.858684 master-0 kubenswrapper[4048]: I0308 03:10:45.858594 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:45.858956 master-0 kubenswrapper[4048]: E0308 03:10:45.858779 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:10:47.858813 master-0 kubenswrapper[4048]: I0308 03:10:47.858762 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:47.859397 master-0 kubenswrapper[4048]: E0308 03:10:47.858983 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:10:49.550867 master-0 kubenswrapper[4048]: I0308 03:10:49.550802 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:49.551448 master-0 kubenswrapper[4048]: E0308 03:10:49.551047 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:49.551448 master-0 kubenswrapper[4048]: E0308 03:10:49.551212 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:10:57.551178285 +0000 UTC m=+96.516650896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:10:49.858631 master-0 kubenswrapper[4048]: I0308 03:10:49.858595 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:49.859034 master-0 kubenswrapper[4048]: E0308 03:10:49.858997 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:10:51.858265 master-0 kubenswrapper[4048]: I0308 03:10:51.858178 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:10:51.859193 master-0 kubenswrapper[4048]: E0308 03:10:51.858526 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:10:53.191971 master-0 kubenswrapper[4048]: I0308 03:10:53.190321 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"]
Mar 08 03:10:53.191971 master-0 kubenswrapper[4048]: I0308 03:10:53.190724 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar 08 03:10:53.198681 master-0 kubenswrapper[4048]: I0308 03:10:53.195770 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 03:10:53.198681 master-0 kubenswrapper[4048]: I0308 03:10:53.195960 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 03:10:53.198681 master-0 kubenswrapper[4048]: I0308 03:10:53.195998 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 03:10:53.198681 master-0 kubenswrapper[4048]: I0308 03:10:53.196169 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 03:10:53.198681 master-0 kubenswrapper[4048]: I0308 03:10:53.196523 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 03:10:53.282695 master-0 kubenswrapper[4048]: I0308 03:10:53.282631 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar 08 03:10:53.283012 master-0 kubenswrapper[4048]: I0308 03:10:53.282738 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar
08 03:10:53.283012 master-0 kubenswrapper[4048]: I0308 03:10:53.282818 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-422p2\" (UniqueName: \"kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.283012 master-0 kubenswrapper[4048]: I0308 03:10:53.282884 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.383663 master-0 kubenswrapper[4048]: I0308 03:10:53.383614 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.384361 master-0 kubenswrapper[4048]: I0308 03:10:53.384325 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.384824 master-0 kubenswrapper[4048]: I0308 03:10:53.384798 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.384906 master-0 kubenswrapper[4048]: I0308 03:10:53.384886 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.384969 master-0 kubenswrapper[4048]: I0308 03:10:53.384945 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.385025 master-0 kubenswrapper[4048]: I0308 03:10:53.384997 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422p2\" (UniqueName: \"kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.390959 master-0 kubenswrapper[4048]: I0308 03:10:53.390927 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.406521 master-0 kubenswrapper[4048]: I0308 03:10:53.404193 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422p2\" (UniqueName: \"kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.407427 master-0 kubenswrapper[4048]: I0308 03:10:53.407262 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9mxjn"] Mar 08 03:10:53.414656 master-0 kubenswrapper[4048]: I0308 03:10:53.414579 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.416257 master-0 kubenswrapper[4048]: I0308 03:10:53.416213 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 03:10:53.416760 master-0 kubenswrapper[4048]: I0308 03:10:53.416728 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 03:10:53.485370 master-0 kubenswrapper[4048]: I0308 03:10:53.485253 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-kubelet\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485370 master-0 kubenswrapper[4048]: I0308 03:10:53.485300 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-var-lib-openvswitch\") pod 
\"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485370 master-0 kubenswrapper[4048]: I0308 03:10:53.485323 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-bin\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485370 master-0 kubenswrapper[4048]: I0308 03:10:53.485341 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485370 master-0 kubenswrapper[4048]: I0308 03:10:53.485364 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-netd\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485383 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-env-overrides\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485436 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-node-log\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485471 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-log-socket\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485529 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-systemd-units\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485553 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-netns\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485584 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485617 4048 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-ovn\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485647 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-config\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485674 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6612131e-f8b4-43cb-9031-251ac924de96-ovn-node-metrics-cert\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485696 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-systemd\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.485734 master-0 kubenswrapper[4048]: I0308 03:10:53.485732 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-script-lib\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.486116 master-0 
kubenswrapper[4048]: I0308 03:10:53.485752 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fjkw\" (UniqueName: \"kubernetes.io/projected/6612131e-f8b4-43cb-9031-251ac924de96-kube-api-access-9fjkw\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.486116 master-0 kubenswrapper[4048]: I0308 03:10:53.485774 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.486116 master-0 kubenswrapper[4048]: I0308 03:10:53.485804 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-etc-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.486116 master-0 kubenswrapper[4048]: I0308 03:10:53.485826 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-slash\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.504891 master-0 kubenswrapper[4048]: I0308 03:10:53.504789 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:10:53.587111 master-0 kubenswrapper[4048]: I0308 03:10:53.587058 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-kubelet\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587111 master-0 kubenswrapper[4048]: I0308 03:10:53.587108 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-var-lib-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587143 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-bin\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587147 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-kubelet\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587170 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mxjn\" (UID: 
\"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587203 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-netd\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587237 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-bin\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587258 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-netd\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587261 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-ovn-kubernetes\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587286 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-var-lib-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: 
\"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587291 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-env-overrides\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587334 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-log-socket\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587370 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-node-log\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587398 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587429 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-systemd-units\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587436 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-node-log\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587511 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-systemd-units\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587513 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587540 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-netns\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.587562 master-0 kubenswrapper[4048]: I0308 03:10:53.587577 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-systemd\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 
03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587584 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-netns\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587511 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-log-socket\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587617 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-systemd\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587642 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-ovn\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587661 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-ovn\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587677 4048 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-config\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587703 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6612131e-f8b4-43cb-9031-251ac924de96-ovn-node-metrics-cert\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587737 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587761 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-script-lib\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.587780 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fjkw\" (UniqueName: \"kubernetes.io/projected/6612131e-f8b4-43cb-9031-251ac924de96-kube-api-access-9fjkw\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 
03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.588446 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-config\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.588517 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.588556 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-etc-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.588591 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-slash\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.588641 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-slash\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 
03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.588684 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-etc-openvswitch\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.588799 master-0 kubenswrapper[4048]: I0308 03:10:53.588686 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-script-lib\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.590181 master-0 kubenswrapper[4048]: I0308 03:10:53.589897 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-env-overrides\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.603267 master-0 kubenswrapper[4048]: I0308 03:10:53.603188 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6612131e-f8b4-43cb-9031-251ac924de96-ovn-node-metrics-cert\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.608579 master-0 kubenswrapper[4048]: I0308 03:10:53.608534 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fjkw\" (UniqueName: \"kubernetes.io/projected/6612131e-f8b4-43cb-9031-251ac924de96-kube-api-access-9fjkw\") pod \"ovnkube-node-9mxjn\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.728447 master-0 
kubenswrapper[4048]: I0308 03:10:53.728362 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" Mar 08 03:10:53.862951 master-0 kubenswrapper[4048]: I0308 03:10:53.862877 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:10:53.863363 master-0 kubenswrapper[4048]: E0308 03:10:53.863046 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:10:54.509687 master-0 kubenswrapper[4048]: W0308 03:10:54.509633 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6612131e_f8b4_43cb_9031_251ac924de96.slice/crio-d3d82a00afa20186cdbfb5aa25ef4be05563376207d7df03a6e01ae58dbf81de WatchSource:0}: Error finding container d3d82a00afa20186cdbfb5aa25ef4be05563376207d7df03a6e01ae58dbf81de: Status 404 returned error can't find the container with id d3d82a00afa20186cdbfb5aa25ef4be05563376207d7df03a6e01ae58dbf81de Mar 08 03:10:54.512027 master-0 kubenswrapper[4048]: W0308 03:10:54.511939 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd358134e_5625_492c_b4f7_460798631270.slice/crio-02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a WatchSource:0}: Error finding container 02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a: Status 404 returned error can't find the container with id 02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a Mar 08 03:10:55.155717 master-0 kubenswrapper[4048]: 
I0308 03:10:55.153750 4048 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="2adae51f407e15a120ce855a4a69d4bbd243779881704875d67dd256bba0227a" exitCode=0 Mar 08 03:10:55.155717 master-0 kubenswrapper[4048]: I0308 03:10:55.153868 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerDied","Data":"2adae51f407e15a120ce855a4a69d4bbd243779881704875d67dd256bba0227a"} Mar 08 03:10:55.158561 master-0 kubenswrapper[4048]: I0308 03:10:55.158426 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" event={"ID":"d358134e-5625-492c-b4f7-460798631270","Type":"ContainerStarted","Data":"52b88723004521dea3092f53f41c4bf532db2cb01970f3d42c58da3ec13e1500"} Mar 08 03:10:55.158673 master-0 kubenswrapper[4048]: I0308 03:10:55.158612 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" event={"ID":"d358134e-5625-492c-b4f7-460798631270","Type":"ContainerStarted","Data":"02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a"} Mar 08 03:10:55.163088 master-0 kubenswrapper[4048]: I0308 03:10:55.160328 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"d3d82a00afa20186cdbfb5aa25ef4be05563376207d7df03a6e01ae58dbf81de"} Mar 08 03:10:55.167302 master-0 kubenswrapper[4048]: I0308 03:10:55.167245 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hfnwm" event={"ID":"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1","Type":"ContainerStarted","Data":"55611d76e47a42fb9153193806551f833dced3bbea90161e7e029adc748235f3"} Mar 08 03:10:55.204395 master-0 kubenswrapper[4048]: I0308 03:10:55.203273 4048 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hfnwm" podStartSLOduration=0.993364719 podStartE2EDuration="14.203247413s" podCreationTimestamp="2026-03-08 03:10:41 +0000 UTC" firstStartedPulling="2026-03-08 03:10:41.383512757 +0000 UTC m=+80.348985368" lastFinishedPulling="2026-03-08 03:10:54.593395491 +0000 UTC m=+93.558868062" observedRunningTime="2026-03-08 03:10:55.202915484 +0000 UTC m=+94.168388055" watchObservedRunningTime="2026-03-08 03:10:55.203247413 +0000 UTC m=+94.168720014" Mar 08 03:10:55.862652 master-0 kubenswrapper[4048]: I0308 03:10:55.862600 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:10:55.863316 master-0 kubenswrapper[4048]: E0308 03:10:55.862763 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:10:56.385025 master-0 kubenswrapper[4048]: I0308 03:10:56.384923 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-l5x6h"] Mar 08 03:10:56.386016 master-0 kubenswrapper[4048]: I0308 03:10:56.385969 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:10:56.386121 master-0 kubenswrapper[4048]: E0308 03:10:56.386064 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:10:56.514945 master-0 kubenswrapper[4048]: I0308 03:10:56.514856 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:10:56.616156 master-0 kubenswrapper[4048]: I0308 03:10:56.616088 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:10:56.628228 master-0 kubenswrapper[4048]: E0308 03:10:56.628199 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:10:56.628228 master-0 kubenswrapper[4048]: E0308 03:10:56.628227 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:10:56.628326 master-0 kubenswrapper[4048]: E0308 03:10:56.628238 4048 projected.go:194] Error preparing data for projected volume kube-api-access-b25w4 for pod openshift-network-diagnostics/network-check-target-l5x6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:10:56.628326 master-0 kubenswrapper[4048]: E0308 03:10:56.628280 4048 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4 podName:aa781f72-e72f-47e1-b37a-977340c182c8 nodeName:}" failed. No retries permitted until 2026-03-08 03:10:57.128266982 +0000 UTC m=+96.093739553 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-b25w4" (UniqueName: "kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4") pod "network-check-target-l5x6h" (UID: "aa781f72-e72f-47e1-b37a-977340c182c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:10:57.221049 master-0 kubenswrapper[4048]: I0308 03:10:57.220971 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:10:57.222078 master-0 kubenswrapper[4048]: E0308 03:10:57.221121 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:10:57.222078 master-0 kubenswrapper[4048]: E0308 03:10:57.221140 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:10:57.222078 master-0 kubenswrapper[4048]: E0308 03:10:57.221152 4048 projected.go:194] Error preparing data for projected volume kube-api-access-b25w4 for pod openshift-network-diagnostics/network-check-target-l5x6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] 
Mar 08 03:10:57.222078 master-0 kubenswrapper[4048]: E0308 03:10:57.221194 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4 podName:aa781f72-e72f-47e1-b37a-977340c182c8 nodeName:}" failed. No retries permitted until 2026-03-08 03:10:58.221181322 +0000 UTC m=+97.186653893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-b25w4" (UniqueName: "kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4") pod "network-check-target-l5x6h" (UID: "aa781f72-e72f-47e1-b37a-977340c182c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:10:57.624268 master-0 kubenswrapper[4048]: I0308 03:10:57.624213 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:10:57.624470 master-0 kubenswrapper[4048]: E0308 03:10:57.624406 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 03:10:57.624543 master-0 kubenswrapper[4048]: E0308 03:10:57.624507 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:13.624466816 +0000 UTC m=+112.589939387 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 03:10:57.858541 master-0 kubenswrapper[4048]: I0308 03:10:57.858303 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:10:57.858541 master-0 kubenswrapper[4048]: E0308 03:10:57.858474 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:10:57.859051 master-0 kubenswrapper[4048]: I0308 03:10:57.858303 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:10:57.859051 master-0 kubenswrapper[4048]: E0308 03:10:57.859005 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:10:58.178906 master-0 kubenswrapper[4048]: I0308 03:10:58.178716 4048 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="db3a63a925785d6eff81f565afee5497f9a99d04d1c84187c3150ffb13b3defd" exitCode=0 Mar 08 03:10:58.179205 master-0 kubenswrapper[4048]: I0308 03:10:58.178812 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerDied","Data":"db3a63a925785d6eff81f565afee5497f9a99d04d1c84187c3150ffb13b3defd"} Mar 08 03:10:58.229777 master-0 kubenswrapper[4048]: I0308 03:10:58.229654 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:10:58.230865 master-0 kubenswrapper[4048]: E0308 03:10:58.229918 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:10:58.230865 master-0 kubenswrapper[4048]: E0308 03:10:58.229947 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:10:58.230865 master-0 kubenswrapper[4048]: E0308 03:10:58.229963 4048 projected.go:194] Error preparing data for projected volume kube-api-access-b25w4 for pod openshift-network-diagnostics/network-check-target-l5x6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:10:58.230865 master-0 kubenswrapper[4048]: E0308 03:10:58.230026 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4 podName:aa781f72-e72f-47e1-b37a-977340c182c8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:00.230003788 +0000 UTC m=+99.195476369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b25w4" (UniqueName: "kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4") pod "network-check-target-l5x6h" (UID: "aa781f72-e72f-47e1-b37a-977340c182c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:10:59.858603 master-0 kubenswrapper[4048]: I0308 03:10:59.858542 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:10:59.859157 master-0 kubenswrapper[4048]: I0308 03:10:59.858629 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:10:59.859157 master-0 kubenswrapper[4048]: E0308 03:10:59.858687 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:10:59.859157 master-0 kubenswrapper[4048]: E0308 03:10:59.858785 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:00.070351 master-0 kubenswrapper[4048]: I0308 03:11:00.070275 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-xjg74"] Mar 08 03:11:00.070811 master-0 kubenswrapper[4048]: I0308 03:11:00.070771 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.072846 master-0 kubenswrapper[4048]: I0308 03:11:00.072807 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 03:11:00.073001 master-0 kubenswrapper[4048]: I0308 03:11:00.072976 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 03:11:00.073191 master-0 kubenswrapper[4048]: I0308 03:11:00.073141 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 03:11:00.073959 master-0 kubenswrapper[4048]: I0308 03:11:00.073921 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 03:11:00.074136 master-0 kubenswrapper[4048]: I0308 03:11:00.073968 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" 
Mar 08 03:11:00.189189 master-0 kubenswrapper[4048]: I0308 03:11:00.189084 4048 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="bf04118919c009f59ea3e84f16c295d8440cef4db850135663e4a2db1d87ef48" exitCode=0 Mar 08 03:11:00.189189 master-0 kubenswrapper[4048]: I0308 03:11:00.189123 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerDied","Data":"bf04118919c009f59ea3e84f16c295d8440cef4db850135663e4a2db1d87ef48"} Mar 08 03:11:00.249575 master-0 kubenswrapper[4048]: I0308 03:11:00.249460 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:00.249747 master-0 kubenswrapper[4048]: I0308 03:11:00.249629 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.249747 master-0 kubenswrapper[4048]: I0308 03:11:00.249650 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.249747 master-0 kubenswrapper[4048]: E0308 03:11:00.249684 4048 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:11:00.249747 master-0 kubenswrapper[4048]: E0308 03:11:00.249712 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:11:00.249747 master-0 kubenswrapper[4048]: E0308 03:11:00.249725 4048 projected.go:194] Error preparing data for projected volume kube-api-access-b25w4 for pod openshift-network-diagnostics/network-check-target-l5x6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:11:00.249930 master-0 kubenswrapper[4048]: E0308 03:11:00.249770 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4 podName:aa781f72-e72f-47e1-b37a-977340c182c8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:04.249752048 +0000 UTC m=+103.215224619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b25w4" (UniqueName: "kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4") pod "network-check-target-l5x6h" (UID: "aa781f72-e72f-47e1-b37a-977340c182c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:11:00.249930 master-0 kubenswrapper[4048]: I0308 03:11:00.249695 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.249930 master-0 kubenswrapper[4048]: I0308 03:11:00.249822 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv8wt\" (UniqueName: \"kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.350407 master-0 kubenswrapper[4048]: I0308 03:11:00.350295 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.350407 master-0 kubenswrapper[4048]: I0308 03:11:00.350344 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv8wt\" (UniqueName: 
\"kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.350697 master-0 kubenswrapper[4048]: I0308 03:11:00.350620 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.350697 master-0 kubenswrapper[4048]: I0308 03:11:00.350651 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.351266 master-0 kubenswrapper[4048]: I0308 03:11:00.351156 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.351266 master-0 kubenswrapper[4048]: I0308 03:11:00.351215 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.356310 master-0 kubenswrapper[4048]: I0308 03:11:00.356275 4048 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.842512 master-0 kubenswrapper[4048]: I0308 03:11:00.842451 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv8wt\" (UniqueName: \"kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.980528 master-0 kubenswrapper[4048]: I0308 03:11:00.980443 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:00.997196 master-0 kubenswrapper[4048]: W0308 03:11:00.997144 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1ceb611_22e9_4a5e_b965_f4a6e2bfd3d6.slice/crio-823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939 WatchSource:0}: Error finding container 823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939: Status 404 returned error can't find the container with id 823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939 Mar 08 03:11:01.197460 master-0 kubenswrapper[4048]: I0308 03:11:01.197288 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xjg74" event={"ID":"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6","Type":"ContainerStarted","Data":"823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939"} Mar 08 03:11:01.863939 master-0 kubenswrapper[4048]: I0308 03:11:01.863886 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:01.864167 master-0 kubenswrapper[4048]: E0308 03:11:01.864028 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:01.864713 master-0 kubenswrapper[4048]: I0308 03:11:01.864598 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:01.864818 master-0 kubenswrapper[4048]: E0308 03:11:01.864705 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:02.569351 master-0 kubenswrapper[4048]: I0308 03:11:02.569257 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:02.571881 master-0 kubenswrapper[4048]: E0308 03:11:02.569450 4048 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:02.571881 master-0 kubenswrapper[4048]: E0308 03:11:02.569532 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:12:06.569512707 +0000 UTC m=+165.534985288 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:03.860884 master-0 kubenswrapper[4048]: I0308 03:11:03.860839 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:03.861528 master-0 kubenswrapper[4048]: E0308 03:11:03.860950 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:03.861528 master-0 kubenswrapper[4048]: I0308 03:11:03.861281 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:03.861528 master-0 kubenswrapper[4048]: E0308 03:11:03.861329 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:04.285276 master-0 kubenswrapper[4048]: I0308 03:11:04.285116 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:04.285276 master-0 kubenswrapper[4048]: E0308 03:11:04.285260 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 03:11:04.285276 master-0 kubenswrapper[4048]: E0308 03:11:04.285276 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 03:11:04.285276 master-0 kubenswrapper[4048]: E0308 03:11:04.285286 4048 projected.go:194] Error preparing data for projected volume kube-api-access-b25w4 for pod openshift-network-diagnostics/network-check-target-l5x6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:11:04.285646 master-0 kubenswrapper[4048]: E0308 03:11:04.285328 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4 podName:aa781f72-e72f-47e1-b37a-977340c182c8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:12.285315858 +0000 UTC m=+111.250788429 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-b25w4" (UniqueName: "kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4") pod "network-check-target-l5x6h" (UID: "aa781f72-e72f-47e1-b37a-977340c182c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:11:05.859030 master-0 kubenswrapper[4048]: I0308 03:11:05.858934 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:05.859591 master-0 kubenswrapper[4048]: I0308 03:11:05.859069 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:05.859591 master-0 kubenswrapper[4048]: E0308 03:11:05.859235 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:05.859591 master-0 kubenswrapper[4048]: E0308 03:11:05.859386 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:07.859076 master-0 kubenswrapper[4048]: I0308 03:11:07.859023 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:07.860055 master-0 kubenswrapper[4048]: I0308 03:11:07.859111 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:07.860055 master-0 kubenswrapper[4048]: E0308 03:11:07.859207 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:07.860055 master-0 kubenswrapper[4048]: E0308 03:11:07.859300 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:09.859757 master-0 kubenswrapper[4048]: I0308 03:11:09.859691 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:09.859757 master-0 kubenswrapper[4048]: I0308 03:11:09.859721 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:09.860273 master-0 kubenswrapper[4048]: E0308 03:11:09.859868 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:09.860273 master-0 kubenswrapper[4048]: E0308 03:11:09.860021 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:11.859612 master-0 kubenswrapper[4048]: I0308 03:11:11.859565 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:11.860165 master-0 kubenswrapper[4048]: E0308 03:11:11.859653 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:11.860165 master-0 kubenswrapper[4048]: I0308 03:11:11.859989 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:11.860165 master-0 kubenswrapper[4048]: E0308 03:11:11.860043 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:12.360466 master-0 kubenswrapper[4048]: I0308 03:11:12.360412 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:12.360695 master-0 kubenswrapper[4048]: E0308 03:11:12.360614 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 03:11:12.360695 master-0 kubenswrapper[4048]: E0308 03:11:12.360638 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 03:11:12.360695 master-0 kubenswrapper[4048]: E0308 03:11:12.360654 4048 projected.go:194] Error preparing data for projected volume kube-api-access-b25w4 for pod openshift-network-diagnostics/network-check-target-l5x6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:11:12.360782 master-0 kubenswrapper[4048]: E0308 03:11:12.360701 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4 podName:aa781f72-e72f-47e1-b37a-977340c182c8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:28.360688237 +0000 UTC m=+127.326160808 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-b25w4" (UniqueName: "kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4") pod "network-check-target-l5x6h" (UID: "aa781f72-e72f-47e1-b37a-977340c182c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 03:11:13.672219 master-0 kubenswrapper[4048]: I0308 03:11:13.672176 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:13.672584 master-0 kubenswrapper[4048]: E0308 03:11:13.672317 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:11:13.672584 master-0 kubenswrapper[4048]: E0308 03:11:13.672368 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.672353502 +0000 UTC m=+144.637826083 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 03:11:13.859556 master-0 kubenswrapper[4048]: I0308 03:11:13.859200 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:13.859743 master-0 kubenswrapper[4048]: I0308 03:11:13.859294 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:13.859743 master-0 kubenswrapper[4048]: E0308 03:11:13.859607 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:13.859834 master-0 kubenswrapper[4048]: E0308 03:11:13.859764 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:13.875770 master-0 kubenswrapper[4048]: I0308 03:11:13.875702 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 08 03:11:14.234883 master-0 kubenswrapper[4048]: I0308 03:11:14.234708 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xjg74" event={"ID":"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6","Type":"ContainerStarted","Data":"0f90c7e80ee619a77867feffa666b20dfa8fad2e9ecc5d700b999460ff6d737b"}
Mar 08 03:11:14.234883 master-0 kubenswrapper[4048]: I0308 03:11:14.234776 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xjg74" event={"ID":"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6","Type":"ContainerStarted","Data":"d75a484300177da4d4f6d08779de6037e0b48914c49438b940a4a56b62ec748f"}
Mar 08 03:11:14.239536 master-0 kubenswrapper[4048]: I0308 03:11:14.239461 4048 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="b3ea93aa98c6a855a072d3642fcdd00f5f7951231e2c2010a477ac7e3afcf009" exitCode=0
Mar 08 03:11:14.239754 master-0 kubenswrapper[4048]: I0308 03:11:14.239688 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerDied","Data":"b3ea93aa98c6a855a072d3642fcdd00f5f7951231e2c2010a477ac7e3afcf009"}
Mar 08 03:11:14.247478 master-0 kubenswrapper[4048]: I0308 03:11:14.247420 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" event={"ID":"d358134e-5625-492c-b4f7-460798631270","Type":"ContainerStarted","Data":"723615f545a9b912d96e2b20f5beb286b3ce93e38e0a010ef0152a7b0e0c1b1e"}
Mar 08 03:11:14.250654 master-0 kubenswrapper[4048]: I0308 03:11:14.250599 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492" exitCode=0
Mar 08 03:11:14.251381 master-0 kubenswrapper[4048]: I0308 03:11:14.250686 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"}
Mar 08 03:11:14.266811 master-0 kubenswrapper[4048]: I0308 03:11:14.263729 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=1.263692909 podStartE2EDuration="1.263692909s" podCreationTimestamp="2026-03-08 03:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:11:14.261965691 +0000 UTC m=+113.227438332" watchObservedRunningTime="2026-03-08 03:11:14.263692909 +0000 UTC m=+113.229165530"
Mar 08 03:11:14.289519 master-0 kubenswrapper[4048]: I0308 03:11:14.289315 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-xjg74" podStartSLOduration=2.851138634 podStartE2EDuration="15.288286233s" podCreationTimestamp="2026-03-08 03:10:59 +0000 UTC" firstStartedPulling="2026-03-08 03:11:01.000282185 +0000 UTC m=+99.965754756" lastFinishedPulling="2026-03-08 03:11:13.437429744 +0000 UTC m=+112.402902355" observedRunningTime="2026-03-08 03:11:14.285973139 +0000 UTC m=+113.251445750" watchObservedRunningTime="2026-03-08 03:11:14.288286233 +0000 UTC m=+113.253758874"
Mar 08 03:11:14.306902 master-0 kubenswrapper[4048]: I0308 03:11:14.305801 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" podStartSLOduration=2.586551701 podStartE2EDuration="21.30577586s" podCreationTimestamp="2026-03-08 03:10:53 +0000 UTC" firstStartedPulling="2026-03-08 03:10:54.721423744 +0000 UTC m=+93.686896315" lastFinishedPulling="2026-03-08 03:11:13.440647873 +0000 UTC m=+112.406120474" observedRunningTime="2026-03-08 03:11:14.304275768 +0000 UTC m=+113.269748379" watchObservedRunningTime="2026-03-08 03:11:14.30577586 +0000 UTC m=+113.271248471"
Mar 08 03:11:15.259776 master-0 kubenswrapper[4048]: I0308 03:11:15.259227 4048 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="c1d3c31c196416ae00334f18b3e579542658be979ab39e41ffb430f787c5ee3e" exitCode=0
Mar 08 03:11:15.261541 master-0 kubenswrapper[4048]: I0308 03:11:15.259347 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerDied","Data":"c1d3c31c196416ae00334f18b3e579542658be979ab39e41ffb430f787c5ee3e"}
Mar 08 03:11:15.269975 master-0 kubenswrapper[4048]: I0308 03:11:15.269632 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"}
Mar 08 03:11:15.269975 master-0 kubenswrapper[4048]: I0308 03:11:15.269832 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"}
Mar 08 03:11:15.269975 master-0 kubenswrapper[4048]: I0308 03:11:15.269860 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"}
Mar 08 03:11:15.269975 master-0 kubenswrapper[4048]: I0308 03:11:15.269888 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"}
Mar 08 03:11:15.269975 master-0 kubenswrapper[4048]: I0308 03:11:15.269912 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"}
Mar 08 03:11:15.269975 master-0 kubenswrapper[4048]: I0308 03:11:15.269936 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"}
Mar 08 03:11:15.858408 master-0 kubenswrapper[4048]: I0308 03:11:15.858352 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:15.858710 master-0 kubenswrapper[4048]: I0308 03:11:15.858464 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:15.858710 master-0 kubenswrapper[4048]: E0308 03:11:15.858523 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:15.858852 master-0 kubenswrapper[4048]: E0308 03:11:15.858702 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:16.278772 master-0 kubenswrapper[4048]: I0308 03:11:16.278568 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" event={"ID":"76ceb013-e999-4f15-bf25-f8dcd2647f9f","Type":"ContainerStarted","Data":"3eabc2891e8b160b4c27953f87671e6abc89b8931184f7b94dde3b5a372db602"}
Mar 08 03:11:16.309618 master-0 kubenswrapper[4048]: I0308 03:11:16.306119 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5qjn5" podStartSLOduration=3.658842469 podStartE2EDuration="35.306089878s" podCreationTimestamp="2026-03-08 03:10:41 +0000 UTC" firstStartedPulling="2026-03-08 03:10:41.692667481 +0000 UTC m=+80.658140082" lastFinishedPulling="2026-03-08 03:11:13.33991489 +0000 UTC m=+112.305387491" observedRunningTime="2026-03-08 03:11:16.306056437 +0000 UTC m=+115.271529038" watchObservedRunningTime="2026-03-08 03:11:16.306089878 +0000 UTC m=+115.271562479"
Mar 08 03:11:17.287035 master-0 kubenswrapper[4048]: I0308 03:11:17.286940 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"}
Mar 08 03:11:17.858539 master-0 kubenswrapper[4048]: I0308 03:11:17.858328 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:17.858539 master-0 kubenswrapper[4048]: I0308 03:11:17.858414 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:17.858909 master-0 kubenswrapper[4048]: E0308 03:11:17.858589 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:17.859419 master-0 kubenswrapper[4048]: E0308 03:11:17.859345 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:19.495031 master-0 kubenswrapper[4048]: I0308 03:11:19.494688 4048 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9mxjn"]
Mar 08 03:11:19.858570 master-0 kubenswrapper[4048]: I0308 03:11:19.858459 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:19.858809 master-0 kubenswrapper[4048]: I0308 03:11:19.858563 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:19.858809 master-0 kubenswrapper[4048]: E0308 03:11:19.858689 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:19.858954 master-0 kubenswrapper[4048]: E0308 03:11:19.858797 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:20.302905 master-0 kubenswrapper[4048]: I0308 03:11:20.302718 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerStarted","Data":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"}
Mar 08 03:11:20.303309 master-0 kubenswrapper[4048]: I0308 03:11:20.303265 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn"
Mar 08 03:11:20.303438 master-0 kubenswrapper[4048]: I0308 03:11:20.303324 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn"
Mar 08 03:11:20.336968 master-0 kubenswrapper[4048]: I0308 03:11:20.336827 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podStartSLOduration=8.411727517 podStartE2EDuration="27.336800865s" podCreationTimestamp="2026-03-08 03:10:53 +0000 UTC" firstStartedPulling="2026-03-08 03:10:54.515719649 +0000 UTC m=+93.481192250" lastFinishedPulling="2026-03-08 03:11:13.440792997 +0000 UTC m=+112.406265598" observedRunningTime="2026-03-08 03:11:20.336072695 +0000 UTC m=+119.301545266" watchObservedRunningTime="2026-03-08 03:11:20.336800865 +0000 UTC m=+119.302273466"
Mar 08 03:11:20.341373 master-0 kubenswrapper[4048]: I0308 03:11:20.341297 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn"
Mar 08 03:11:21.306380 master-0 kubenswrapper[4048]: I0308 03:11:21.306282 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-controller" containerID="cri-o://b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4" gracePeriod=30
Mar 08 03:11:21.307386 master-0 kubenswrapper[4048]: I0308 03:11:21.306311 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="nbdb" containerID="cri-o://a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" gracePeriod=30
Mar 08 03:11:21.307386 master-0 kubenswrapper[4048]: I0308 03:11:21.306421 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" gracePeriod=30
Mar 08 03:11:21.307386 master-0 kubenswrapper[4048]: I0308 03:11:21.306566 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-node" containerID="cri-o://b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" gracePeriod=30
Mar 08 03:11:21.307386 master-0 kubenswrapper[4048]: I0308 03:11:21.306553 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="sbdb" containerID="cri-o://5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" gracePeriod=30
Mar 08 03:11:21.307386 master-0 kubenswrapper[4048]: I0308 03:11:21.306595 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn"
Mar 08 03:11:21.307386 master-0 kubenswrapper[4048]: I0308 03:11:21.306431 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-acl-logging" containerID="cri-o://db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0" gracePeriod=30
Mar 08 03:11:21.307386 master-0 kubenswrapper[4048]: I0308 03:11:21.306667 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="northd" containerID="cri-o://8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" gracePeriod=30
Mar 08 03:11:21.311966 master-0 kubenswrapper[4048]: E0308 03:11:21.311864 4048 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 08 03:11:21.322094 master-0 kubenswrapper[4048]: E0308 03:11:21.321852 4048 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 08 03:11:21.327448 master-0 kubenswrapper[4048]: E0308 03:11:21.326732 4048 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"]
Mar 08 03:11:21.327448 master-0 kubenswrapper[4048]: E0308 03:11:21.326834 4048 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="nbdb"
Mar 08 03:11:21.355621 master-0 kubenswrapper[4048]: I0308 03:11:21.354886 4048 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovnkube-controller" containerID="cri-o://9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4" gracePeriod=30
Mar 08 03:11:21.738385 master-0 kubenswrapper[4048]: E0308 03:11:21.738322 4048 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 08 03:11:21.858711 master-0 kubenswrapper[4048]: I0308 03:11:21.858281 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:21.858711 master-0 kubenswrapper[4048]: I0308 03:11:21.858334 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:21.859619 master-0 kubenswrapper[4048]: E0308 03:11:21.859559 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c"
Mar 08 03:11:21.859834 master-0 kubenswrapper[4048]: E0308 03:11:21.859756 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8"
Mar 08 03:11:21.869872 master-0 kubenswrapper[4048]: E0308 03:11:21.869773 4048 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 08 03:11:22.193087 master-0 kubenswrapper[4048]: I0308 03:11:22.193018 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/ovnkube-controller/0.log"
Mar 08 03:11:22.195973 master-0 kubenswrapper[4048]: I0308 03:11:22.195916 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/kube-rbac-proxy-ovn-metrics/0.log"
Mar 08 03:11:22.196874 master-0 kubenswrapper[4048]: I0308 03:11:22.196827 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/kube-rbac-proxy-node/0.log"
Mar 08 03:11:22.197786 master-0 kubenswrapper[4048]: I0308 03:11:22.197732 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/ovn-acl-logging/0.log"
Mar 08 03:11:22.198766 master-0 kubenswrapper[4048]: I0308 03:11:22.198712 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/ovn-controller/0.log"
Mar 08 03:11:22.199543 master-0 kubenswrapper[4048]: I0308 03:11:22.199515 4048 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn"
Mar 08 03:11:22.264305 master-0 kubenswrapper[4048]: I0308 03:11:22.264219 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krdvz"]
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: E0308 03:11:22.264391 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="sbdb"
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: I0308 03:11:22.264413 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="sbdb"
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: E0308 03:11:22.264430 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="nbdb"
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: I0308 03:11:22.264442 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="nbdb"
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: E0308 03:11:22.264454 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-controller"
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: I0308 03:11:22.264467 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-controller"
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: E0308 03:11:22.264480 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 03:11:22.264598 master-0 kubenswrapper[4048]: I0308 03:11:22.264577 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: E0308 03:11:22.264663 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-acl-logging"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264679 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-acl-logging"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: E0308 03:11:22.264726 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovnkube-controller"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264741 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovnkube-controller"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: E0308 03:11:22.264756 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-node"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264768 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-node"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: E0308 03:11:22.264781 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="northd"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264794 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="northd"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: E0308 03:11:22.264807 4048 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kubecfg-setup"
Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264821 4048 state_mem.go:107] "Deleted CPUSet assignment" podUID="6612131e-f8b4-43cb-9031-251ac924de96"
containerName="kubecfg-setup" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264938 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-node" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264956 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264969 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="northd" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264982 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="nbdb" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.264995 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="sbdb" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.265008 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovnkube-controller" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.265020 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-acl-logging" Mar 08 03:11:22.265058 master-0 kubenswrapper[4048]: I0308 03:11:22.265032 4048 memory_manager.go:354] "RemoveStaleState removing state" podUID="6612131e-f8b4-43cb-9031-251ac924de96" containerName="ovn-controller" Mar 08 03:11:22.266394 master-0 kubenswrapper[4048]: I0308 03:11:22.266343 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.270911 master-0 kubenswrapper[4048]: I0308 03:11:22.270709 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-bin\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.270911 master-0 kubenswrapper[4048]: I0308 03:11:22.270772 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-ovn-kubernetes\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.270911 master-0 kubenswrapper[4048]: I0308 03:11:22.270826 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.270911 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-env-overrides\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.270982 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-config\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.271034 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.271081 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-ovn\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.271125 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-log-socket\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.271165 4048 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-systemd-units\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.271208 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-etc-openvswitch\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.271252 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-netns\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271275 master-0 kubenswrapper[4048]: I0308 03:11:22.271263 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-log-socket" (OuterVolumeSpecName: "log-socket") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271312 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271415 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-script-lib\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") " Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271559 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271618 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271662 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271412 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod 
"6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271707 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271825 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271884 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271923 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.271995 master-0 kubenswrapper[4048]: I0308 03:11:22.271933 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272242 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272330 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272458 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272530 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272528 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272554 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272640 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272703 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272732 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272793 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hb9\" (UniqueName: 
\"kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.272815 master-0 kubenswrapper[4048]: I0308 03:11:22.272825 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.273636 master-0 kubenswrapper[4048]: I0308 03:11:22.272885 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.273636 master-0 kubenswrapper[4048]: I0308 03:11:22.272911 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.273636 master-0 kubenswrapper[4048]: I0308 03:11:22.272826 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:11:22.273636 master-0 kubenswrapper[4048]: I0308 03:11:22.273042 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.273636 master-0 kubenswrapper[4048]: I0308 03:11:22.273072 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:11:22.273636 master-0 kubenswrapper[4048]: I0308 03:11:22.273123 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.273636 master-0 kubenswrapper[4048]: I0308 03:11:22.273191 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274314 4048 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274374 4048 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-env-overrides\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274400 4048 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274428 4048 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274520 4048 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-log-socket\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274551 4048 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274576 4048 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-systemd-units\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274600 4048 reconciler_common.go:293] "Volume detached for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274624 4048 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6612131e-f8b4-43cb-9031-251ac924de96-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.274692 master-0 kubenswrapper[4048]: I0308 03:11:22.274649 4048 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-run-netns\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.275603 master-0 kubenswrapper[4048]: I0308 03:11:22.275549 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:11:22.312017 master-0 kubenswrapper[4048]: I0308 03:11:22.311939 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/ovnkube-controller/0.log" Mar 08 03:11:22.314735 master-0 kubenswrapper[4048]: I0308 03:11:22.314683 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/kube-rbac-proxy-ovn-metrics/0.log" Mar 08 03:11:22.315356 master-0 kubenswrapper[4048]: I0308 03:11:22.315308 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/kube-rbac-proxy-node/0.log" Mar 08 03:11:22.315991 master-0 kubenswrapper[4048]: I0308 03:11:22.315916 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/ovn-acl-logging/0.log" Mar 08 03:11:22.317133 master-0 kubenswrapper[4048]: I0308 03:11:22.316993 4048 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9mxjn_6612131e-f8b4-43cb-9031-251ac924de96/ovn-controller/0.log" Mar 08 03:11:22.317772 master-0 kubenswrapper[4048]: I0308 03:11:22.317640 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4" exitCode=2 Mar 08 03:11:22.317772 master-0 kubenswrapper[4048]: I0308 03:11:22.317682 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" exitCode=0 Mar 08 03:11:22.317772 master-0 kubenswrapper[4048]: I0308 03:11:22.317703 4048 generic.go:334] "Generic (PLEG): container finished" 
podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" exitCode=0
Mar 08 03:11:22.317772 master-0 kubenswrapper[4048]: I0308 03:11:22.317720 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" exitCode=0
Mar 08 03:11:22.317772 master-0 kubenswrapper[4048]: I0308 03:11:22.317721 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"}
Mar 08 03:11:22.317772 master-0 kubenswrapper[4048]: I0308 03:11:22.317770 4048 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn"
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.317786 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"}
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.317811 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"}
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.317830 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"}
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.317851 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"}
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.317878 4048 scope.go:117] "RemoveContainer" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.317737 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" exitCode=143
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.318006 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" exitCode=143
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.318030 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0" exitCode=143
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.318046 4048 generic.go:334] "Generic (PLEG): container finished" podID="6612131e-f8b4-43cb-9031-251ac924de96" containerID="b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4" exitCode=143
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.318069 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"}
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.318090 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"}
Mar 08 03:11:22.318198 master-0 kubenswrapper[4048]: I0308 03:11:22.318210 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318223 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318243 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318264 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318281 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318296 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318311 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318325 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318339 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318352 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318362 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318372 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318388 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318406 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318418 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318429 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318440 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318451 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318463 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318473 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318518 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318533 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318547 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9mxjn" event={"ID":"6612131e-f8b4-43cb-9031-251ac924de96","Type":"ContainerDied","Data":"d3d82a00afa20186cdbfb5aa25ef4be05563376207d7df03a6e01ae58dbf81de"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318563 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318575 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318586 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318596 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318607 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"}
Mar 08 03:11:22.319027 master-0 kubenswrapper[4048]: I0308 03:11:22.318618 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"}
Mar 08 03:11:22.320806 master-0 kubenswrapper[4048]: I0308 03:11:22.318666 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"}
Mar 08 03:11:22.320806 master-0 kubenswrapper[4048]: I0308 03:11:22.318682 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"}
Mar 08 03:11:22.320806 master-0 kubenswrapper[4048]: I0308 03:11:22.318696 4048 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"}
Mar 08 03:11:22.341833 master-0 kubenswrapper[4048]: I0308 03:11:22.341785 4048 scope.go:117] "RemoveContainer" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"
Mar 08 03:11:22.359730 master-0 kubenswrapper[4048]: I0308 03:11:22.359685 4048 scope.go:117] "RemoveContainer" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"
Mar 08 03:11:22.371936 master-0 kubenswrapper[4048]: I0308 03:11:22.371886 4048 scope.go:117] "RemoveContainer" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"
Mar 08 03:11:22.375830 master-0 kubenswrapper[4048]: I0308 03:11:22.375778 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-node-log\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.375968 master-0 kubenswrapper[4048]: I0308 03:11:22.375832 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-systemd\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.375968 master-0 kubenswrapper[4048]: I0308 03:11:22.375870 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-slash\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.375968 master-0 kubenswrapper[4048]: I0308 03:11:22.375903 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-kubelet\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.375968 master-0 kubenswrapper[4048]: I0308 03:11:22.375938 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-var-lib-openvswitch\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.375968 master-0 kubenswrapper[4048]: I0308 03:11:22.375969 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-netd\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.376013 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6612131e-f8b4-43cb-9031-251ac924de96-ovn-node-metrics-cert\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.376043 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-openvswitch\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.375899 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-node-log" (OuterVolumeSpecName: "node-log") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.375929 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-slash" (OuterVolumeSpecName: "host-slash") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.375956 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.376090 4048 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fjkw\" (UniqueName: \"kubernetes.io/projected/6612131e-f8b4-43cb-9031-251ac924de96-kube-api-access-9fjkw\") pod \"6612131e-f8b4-43cb-9031-251ac924de96\" (UID: \"6612131e-f8b4-43cb-9031-251ac924de96\") "
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.376220 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.376068 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:11:22.376248 master-0 kubenswrapper[4048]: I0308 03:11:22.375981 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:11:22.376777 master-0 kubenswrapper[4048]: I0308 03:11:22.376140 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:11:22.376777 master-0 kubenswrapper[4048]: I0308 03:11:22.376304 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.376777 master-0 kubenswrapper[4048]: I0308 03:11:22.376325 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.376777 master-0 kubenswrapper[4048]: I0308 03:11:22.376642 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.376777 master-0 kubenswrapper[4048]: I0308 03:11:22.376704 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.376777 master-0 kubenswrapper[4048]: I0308 03:11:22.376755 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.376777 master-0 kubenswrapper[4048]: I0308 03:11:22.376787 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.376944 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.376991 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377036 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.376979 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377114 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377136 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377156 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377174 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377191 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377196 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.377262 master-0 kubenswrapper[4048]: I0308 03:11:22.377269 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377286 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377313 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377335 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377342 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377366 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377411 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377434 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377452 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hb9\" (UniqueName: \"kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377475 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377413 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377522 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377565 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377592 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377627 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377687 4048 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377680 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377738 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.378044 master-0 kubenswrapper[4048]: I0308 03:11:22.377707 4048 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-node-log\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.377777 4048 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.377793 4048 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.377807 4048 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.377828 4048 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.377841 4048 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.377799 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.377860 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.378326 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.379277 master-0 kubenswrapper[4048]: I0308 03:11:22.378411 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.383748 master-0 kubenswrapper[4048]: I0308 03:11:22.383669 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6612131e-f8b4-43cb-9031-251ac924de96-kube-api-access-9fjkw" (OuterVolumeSpecName: "kube-api-access-9fjkw") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "kube-api-access-9fjkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:11:22.384378 master-0 kubenswrapper[4048]: I0308 03:11:22.384330 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.384747 master-0 kubenswrapper[4048]: I0308 03:11:22.384676 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6612131e-f8b4-43cb-9031-251ac924de96-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:11:22.387996 master-0 kubenswrapper[4048]: I0308 03:11:22.387932 4048 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6612131e-f8b4-43cb-9031-251ac924de96" (UID: "6612131e-f8b4-43cb-9031-251ac924de96"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:11:22.405683 master-0 kubenswrapper[4048]: I0308 03:11:22.405620 4048 scope.go:117] "RemoveContainer" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"
Mar 08 03:11:22.406425 master-0 kubenswrapper[4048]: I0308 03:11:22.406368 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hb9\" (UniqueName: \"kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:22.418532 master-0 kubenswrapper[4048]: I0308 03:11:22.418452 4048 scope.go:117] "RemoveContainer" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"
Mar 08 03:11:22.432106 master-0 kubenswrapper[4048]: I0308 03:11:22.432053 4048 scope.go:117] "RemoveContainer" containerID="db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"
Mar 08 03:11:22.442779 master-0 kubenswrapper[4048]: I0308 03:11:22.442731 4048 scope.go:117] "RemoveContainer" containerID="b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"
Mar 08 03:11:22.457787 master-0 kubenswrapper[4048]: I0308 03:11:22.457739 4048 scope.go:117] "RemoveContainer" containerID="c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"
Mar 08 03:11:22.469071 master-0 kubenswrapper[4048]: I0308 03:11:22.469028 4048 scope.go:117] "RemoveContainer" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"
Mar 08 03:11:22.469571 master-0 kubenswrapper[4048]: E0308 03:11:22.469522 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": container with ID starting with 9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4 not found: ID does not
exist" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4" Mar 08 03:11:22.469700 master-0 kubenswrapper[4048]: I0308 03:11:22.469626 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"} err="failed to get container status \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": rpc error: code = NotFound desc = could not find container \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": container with ID starting with 9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4 not found: ID does not exist" Mar 08 03:11:22.469700 master-0 kubenswrapper[4048]: I0308 03:11:22.469683 4048 scope.go:117] "RemoveContainer" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" Mar 08 03:11:22.470059 master-0 kubenswrapper[4048]: E0308 03:11:22.469992 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": container with ID starting with 5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f not found: ID does not exist" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" Mar 08 03:11:22.470164 master-0 kubenswrapper[4048]: I0308 03:11:22.470051 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"} err="failed to get container status \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": rpc error: code = NotFound desc = could not find container \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": container with ID starting with 5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f not found: ID does not exist" Mar 08 03:11:22.470164 master-0 
kubenswrapper[4048]: I0308 03:11:22.470091 4048 scope.go:117] "RemoveContainer" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" Mar 08 03:11:22.470617 master-0 kubenswrapper[4048]: E0308 03:11:22.470569 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": container with ID starting with a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d not found: ID does not exist" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" Mar 08 03:11:22.470617 master-0 kubenswrapper[4048]: I0308 03:11:22.470603 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"} err="failed to get container status \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": rpc error: code = NotFound desc = could not find container \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": container with ID starting with a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d not found: ID does not exist" Mar 08 03:11:22.470808 master-0 kubenswrapper[4048]: I0308 03:11:22.470625 4048 scope.go:117] "RemoveContainer" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" Mar 08 03:11:22.471114 master-0 kubenswrapper[4048]: E0308 03:11:22.471057 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": container with ID starting with 8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1 not found: ID does not exist" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" Mar 08 03:11:22.471114 master-0 kubenswrapper[4048]: I0308 03:11:22.471096 4048 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"} err="failed to get container status \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": rpc error: code = NotFound desc = could not find container \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": container with ID starting with 8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1 not found: ID does not exist" Mar 08 03:11:22.471316 master-0 kubenswrapper[4048]: I0308 03:11:22.471123 4048 scope.go:117] "RemoveContainer" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" Mar 08 03:11:22.471540 master-0 kubenswrapper[4048]: E0308 03:11:22.471475 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": container with ID starting with 0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf not found: ID does not exist" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" Mar 08 03:11:22.471649 master-0 kubenswrapper[4048]: I0308 03:11:22.471553 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"} err="failed to get container status \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": rpc error: code = NotFound desc = could not find container \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": container with ID starting with 0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf not found: ID does not exist" Mar 08 03:11:22.471649 master-0 kubenswrapper[4048]: I0308 03:11:22.471598 4048 scope.go:117] "RemoveContainer" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" Mar 08 
03:11:22.472030 master-0 kubenswrapper[4048]: E0308 03:11:22.471978 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": container with ID starting with b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d not found: ID does not exist" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" Mar 08 03:11:22.472030 master-0 kubenswrapper[4048]: I0308 03:11:22.472013 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"} err="failed to get container status \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": rpc error: code = NotFound desc = could not find container \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": container with ID starting with b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d not found: ID does not exist" Mar 08 03:11:22.472030 master-0 kubenswrapper[4048]: I0308 03:11:22.472035 4048 scope.go:117] "RemoveContainer" containerID="db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0" Mar 08 03:11:22.472672 master-0 kubenswrapper[4048]: E0308 03:11:22.472635 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": container with ID starting with db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0 not found: ID does not exist" containerID="db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0" Mar 08 03:11:22.472769 master-0 kubenswrapper[4048]: I0308 03:11:22.472669 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"} err="failed 
to get container status \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": rpc error: code = NotFound desc = could not find container \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": container with ID starting with db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0 not found: ID does not exist" Mar 08 03:11:22.472769 master-0 kubenswrapper[4048]: I0308 03:11:22.472691 4048 scope.go:117] "RemoveContainer" containerID="b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4" Mar 08 03:11:22.473024 master-0 kubenswrapper[4048]: E0308 03:11:22.472983 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": container with ID starting with b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4 not found: ID does not exist" containerID="b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4" Mar 08 03:11:22.473119 master-0 kubenswrapper[4048]: I0308 03:11:22.473027 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"} err="failed to get container status \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": rpc error: code = NotFound desc = could not find container \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": container with ID starting with b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4 not found: ID does not exist" Mar 08 03:11:22.473119 master-0 kubenswrapper[4048]: I0308 03:11:22.473054 4048 scope.go:117] "RemoveContainer" containerID="c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492" Mar 08 03:11:22.473423 master-0 kubenswrapper[4048]: E0308 03:11:22.473389 4048 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": container with ID starting with c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492 not found: ID does not exist" containerID="c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492" Mar 08 03:11:22.473423 master-0 kubenswrapper[4048]: I0308 03:11:22.473418 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"} err="failed to get container status \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": rpc error: code = NotFound desc = could not find container \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": container with ID starting with c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492 not found: ID does not exist" Mar 08 03:11:22.473644 master-0 kubenswrapper[4048]: I0308 03:11:22.473436 4048 scope.go:117] "RemoveContainer" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4" Mar 08 03:11:22.473743 master-0 kubenswrapper[4048]: I0308 03:11:22.473706 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"} err="failed to get container status \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": rpc error: code = NotFound desc = could not find container \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": container with ID starting with 9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4 not found: ID does not exist" Mar 08 03:11:22.473827 master-0 kubenswrapper[4048]: I0308 03:11:22.473741 4048 scope.go:117] "RemoveContainer" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" Mar 08 03:11:22.474090 master-0 kubenswrapper[4048]: I0308 03:11:22.474055 4048 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"} err="failed to get container status \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": rpc error: code = NotFound desc = could not find container \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": container with ID starting with 5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f not found: ID does not exist" Mar 08 03:11:22.474090 master-0 kubenswrapper[4048]: I0308 03:11:22.474080 4048 scope.go:117] "RemoveContainer" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" Mar 08 03:11:22.474377 master-0 kubenswrapper[4048]: I0308 03:11:22.474341 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"} err="failed to get container status \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": rpc error: code = NotFound desc = could not find container \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": container with ID starting with a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d not found: ID does not exist" Mar 08 03:11:22.474377 master-0 kubenswrapper[4048]: I0308 03:11:22.474366 4048 scope.go:117] "RemoveContainer" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" Mar 08 03:11:22.474669 master-0 kubenswrapper[4048]: I0308 03:11:22.474631 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"} err="failed to get container status \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": rpc error: code = NotFound desc = could not find container \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": container with ID starting with 
8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1 not found: ID does not exist" Mar 08 03:11:22.474669 master-0 kubenswrapper[4048]: I0308 03:11:22.474658 4048 scope.go:117] "RemoveContainer" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" Mar 08 03:11:22.474941 master-0 kubenswrapper[4048]: I0308 03:11:22.474906 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"} err="failed to get container status \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": rpc error: code = NotFound desc = could not find container \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": container with ID starting with 0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf not found: ID does not exist" Mar 08 03:11:22.474941 master-0 kubenswrapper[4048]: I0308 03:11:22.474931 4048 scope.go:117] "RemoveContainer" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" Mar 08 03:11:22.475280 master-0 kubenswrapper[4048]: I0308 03:11:22.475240 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"} err="failed to get container status \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": rpc error: code = NotFound desc = could not find container \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": container with ID starting with b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d not found: ID does not exist" Mar 08 03:11:22.475280 master-0 kubenswrapper[4048]: I0308 03:11:22.475272 4048 scope.go:117] "RemoveContainer" containerID="db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0" Mar 08 03:11:22.475647 master-0 kubenswrapper[4048]: I0308 03:11:22.475571 4048 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"} err="failed to get container status \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": rpc error: code = NotFound desc = could not find container \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": container with ID starting with db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0 not found: ID does not exist" Mar 08 03:11:22.475647 master-0 kubenswrapper[4048]: I0308 03:11:22.475591 4048 scope.go:117] "RemoveContainer" containerID="b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4" Mar 08 03:11:22.476154 master-0 kubenswrapper[4048]: I0308 03:11:22.476056 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"} err="failed to get container status \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": rpc error: code = NotFound desc = could not find container \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": container with ID starting with b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4 not found: ID does not exist" Mar 08 03:11:22.476154 master-0 kubenswrapper[4048]: I0308 03:11:22.476136 4048 scope.go:117] "RemoveContainer" containerID="c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492" Mar 08 03:11:22.476642 master-0 kubenswrapper[4048]: I0308 03:11:22.476561 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"} err="failed to get container status \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": rpc error: code = NotFound desc = could not find container \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": container with ID starting with 
c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492 not found: ID does not exist" Mar 08 03:11:22.476642 master-0 kubenswrapper[4048]: I0308 03:11:22.476630 4048 scope.go:117] "RemoveContainer" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4" Mar 08 03:11:22.476994 master-0 kubenswrapper[4048]: I0308 03:11:22.476942 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"} err="failed to get container status \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": rpc error: code = NotFound desc = could not find container \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": container with ID starting with 9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4 not found: ID does not exist" Mar 08 03:11:22.476994 master-0 kubenswrapper[4048]: I0308 03:11:22.476969 4048 scope.go:117] "RemoveContainer" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" Mar 08 03:11:22.477301 master-0 kubenswrapper[4048]: I0308 03:11:22.477249 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"} err="failed to get container status \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": rpc error: code = NotFound desc = could not find container \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": container with ID starting with 5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f not found: ID does not exist" Mar 08 03:11:22.477301 master-0 kubenswrapper[4048]: I0308 03:11:22.477278 4048 scope.go:117] "RemoveContainer" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" Mar 08 03:11:22.477595 master-0 kubenswrapper[4048]: I0308 03:11:22.477560 4048 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"} err="failed to get container status \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": rpc error: code = NotFound desc = could not find container \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": container with ID starting with a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d not found: ID does not exist" Mar 08 03:11:22.477595 master-0 kubenswrapper[4048]: I0308 03:11:22.477588 4048 scope.go:117] "RemoveContainer" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" Mar 08 03:11:22.478470 master-0 kubenswrapper[4048]: I0308 03:11:22.478122 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"} err="failed to get container status \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": rpc error: code = NotFound desc = could not find container \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": container with ID starting with 8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1 not found: ID does not exist" Mar 08 03:11:22.478470 master-0 kubenswrapper[4048]: I0308 03:11:22.478164 4048 scope.go:117] "RemoveContainer" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" Mar 08 03:11:22.478470 master-0 kubenswrapper[4048]: I0308 03:11:22.478417 4048 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6612131e-f8b4-43cb-9031-251ac924de96-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.478470 master-0 kubenswrapper[4048]: I0308 03:11:22.478437 4048 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fjkw\" (UniqueName: 
\"kubernetes.io/projected/6612131e-f8b4-43cb-9031-251ac924de96-kube-api-access-9fjkw\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.478470 master-0 kubenswrapper[4048]: I0308 03:11:22.478451 4048 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6612131e-f8b4-43cb-9031-251ac924de96-run-systemd\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:22.478470 master-0 kubenswrapper[4048]: I0308 03:11:22.478457 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"} err="failed to get container status \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": rpc error: code = NotFound desc = could not find container \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": container with ID starting with 0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf not found: ID does not exist" Mar 08 03:11:22.478470 master-0 kubenswrapper[4048]: I0308 03:11:22.478507 4048 scope.go:117] "RemoveContainer" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" Mar 08 03:11:22.479046 master-0 kubenswrapper[4048]: I0308 03:11:22.478815 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"} err="failed to get container status \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": rpc error: code = NotFound desc = could not find container \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": container with ID starting with b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d not found: ID does not exist" Mar 08 03:11:22.479046 master-0 kubenswrapper[4048]: I0308 03:11:22.478850 4048 scope.go:117] "RemoveContainer" containerID="db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0" 
Mar 08 03:11:22.479293 master-0 kubenswrapper[4048]: I0308 03:11:22.479202 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"} err="failed to get container status \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": rpc error: code = NotFound desc = could not find container \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": container with ID starting with db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0 not found: ID does not exist" Mar 08 03:11:22.479293 master-0 kubenswrapper[4048]: I0308 03:11:22.479268 4048 scope.go:117] "RemoveContainer" containerID="b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4" Mar 08 03:11:22.479641 master-0 kubenswrapper[4048]: I0308 03:11:22.479587 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"} err="failed to get container status \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": rpc error: code = NotFound desc = could not find container \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": container with ID starting with b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4 not found: ID does not exist" Mar 08 03:11:22.479641 master-0 kubenswrapper[4048]: I0308 03:11:22.479626 4048 scope.go:117] "RemoveContainer" containerID="c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492" Mar 08 03:11:22.480046 master-0 kubenswrapper[4048]: I0308 03:11:22.479989 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"} err="failed to get container status \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": rpc error: code = NotFound desc = could not find container 
\"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": container with ID starting with c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492 not found: ID does not exist" Mar 08 03:11:22.480046 master-0 kubenswrapper[4048]: I0308 03:11:22.480022 4048 scope.go:117] "RemoveContainer" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4" Mar 08 03:11:22.480538 master-0 kubenswrapper[4048]: I0308 03:11:22.480471 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"} err="failed to get container status \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": rpc error: code = NotFound desc = could not find container \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": container with ID starting with 9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4 not found: ID does not exist" Mar 08 03:11:22.480538 master-0 kubenswrapper[4048]: I0308 03:11:22.480521 4048 scope.go:117] "RemoveContainer" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" Mar 08 03:11:22.480871 master-0 kubenswrapper[4048]: I0308 03:11:22.480813 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"} err="failed to get container status \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": rpc error: code = NotFound desc = could not find container \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": container with ID starting with 5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f not found: ID does not exist" Mar 08 03:11:22.480871 master-0 kubenswrapper[4048]: I0308 03:11:22.480847 4048 scope.go:117] "RemoveContainer" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" Mar 08 
03:11:22.481246 master-0 kubenswrapper[4048]: I0308 03:11:22.481192 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"} err="failed to get container status \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": rpc error: code = NotFound desc = could not find container \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": container with ID starting with a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d not found: ID does not exist" Mar 08 03:11:22.481246 master-0 kubenswrapper[4048]: I0308 03:11:22.481224 4048 scope.go:117] "RemoveContainer" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" Mar 08 03:11:22.481605 master-0 kubenswrapper[4048]: I0308 03:11:22.481553 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"} err="failed to get container status \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": rpc error: code = NotFound desc = could not find container \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": container with ID starting with 8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1 not found: ID does not exist" Mar 08 03:11:22.481605 master-0 kubenswrapper[4048]: I0308 03:11:22.481584 4048 scope.go:117] "RemoveContainer" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" Mar 08 03:11:22.481929 master-0 kubenswrapper[4048]: I0308 03:11:22.481876 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"} err="failed to get container status \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": rpc error: code = NotFound desc = could not find container 
\"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": container with ID starting with 0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf not found: ID does not exist" Mar 08 03:11:22.481929 master-0 kubenswrapper[4048]: I0308 03:11:22.481907 4048 scope.go:117] "RemoveContainer" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" Mar 08 03:11:22.482213 master-0 kubenswrapper[4048]: I0308 03:11:22.482178 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"} err="failed to get container status \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": rpc error: code = NotFound desc = could not find container \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": container with ID starting with b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d not found: ID does not exist" Mar 08 03:11:22.482324 master-0 kubenswrapper[4048]: I0308 03:11:22.482206 4048 scope.go:117] "RemoveContainer" containerID="db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0" Mar 08 03:11:22.482667 master-0 kubenswrapper[4048]: I0308 03:11:22.482610 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0"} err="failed to get container status \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": rpc error: code = NotFound desc = could not find container \"db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0\": container with ID starting with db9cd3d07fecfd2b96d779e12aaa7165ed020da71b7baccad959163044804ce0 not found: ID does not exist" Mar 08 03:11:22.482667 master-0 kubenswrapper[4048]: I0308 03:11:22.482661 4048 scope.go:117] "RemoveContainer" containerID="b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4" Mar 08 
03:11:22.483061 master-0 kubenswrapper[4048]: I0308 03:11:22.483007 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4"} err="failed to get container status \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": rpc error: code = NotFound desc = could not find container \"b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4\": container with ID starting with b0e92266ea8077ea03e6784cce2ef88cf7debbdc4e8cc7cde731095bc8b640d4 not found: ID does not exist" Mar 08 03:11:22.483061 master-0 kubenswrapper[4048]: I0308 03:11:22.483037 4048 scope.go:117] "RemoveContainer" containerID="c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492" Mar 08 03:11:22.483439 master-0 kubenswrapper[4048]: I0308 03:11:22.483384 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492"} err="failed to get container status \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": rpc error: code = NotFound desc = could not find container \"c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492\": container with ID starting with c5ee2a90ef0c8f47ba3515f4511f8915194993dad964ed5d3f54af217da70492 not found: ID does not exist" Mar 08 03:11:22.483439 master-0 kubenswrapper[4048]: I0308 03:11:22.483415 4048 scope.go:117] "RemoveContainer" containerID="9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4" Mar 08 03:11:22.483771 master-0 kubenswrapper[4048]: I0308 03:11:22.483711 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4"} err="failed to get container status \"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": rpc error: code = NotFound desc = could not find container 
\"9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4\": container with ID starting with 9519e7fe5ac7290d9010a3ac9275cac8375cf4bbcd7eb1d5d7a2609a182ae3a4 not found: ID does not exist" Mar 08 03:11:22.483771 master-0 kubenswrapper[4048]: I0308 03:11:22.483743 4048 scope.go:117] "RemoveContainer" containerID="5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f" Mar 08 03:11:22.484241 master-0 kubenswrapper[4048]: I0308 03:11:22.484175 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f"} err="failed to get container status \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": rpc error: code = NotFound desc = could not find container \"5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f\": container with ID starting with 5e3062d2ddf9d140c8880b9dd5bdd322e3d096255295cd848f4eea7707f7b85f not found: ID does not exist" Mar 08 03:11:22.484241 master-0 kubenswrapper[4048]: I0308 03:11:22.484219 4048 scope.go:117] "RemoveContainer" containerID="a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d" Mar 08 03:11:22.484608 master-0 kubenswrapper[4048]: I0308 03:11:22.484560 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d"} err="failed to get container status \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": rpc error: code = NotFound desc = could not find container \"a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d\": container with ID starting with a65376904b5b720babc545e4703f5434f1f0b5dd15186b1245022bf20213fd4d not found: ID does not exist" Mar 08 03:11:22.484608 master-0 kubenswrapper[4048]: I0308 03:11:22.484590 4048 scope.go:117] "RemoveContainer" containerID="8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1" Mar 08 
03:11:22.485090 master-0 kubenswrapper[4048]: I0308 03:11:22.484985 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1"} err="failed to get container status \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": rpc error: code = NotFound desc = could not find container \"8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1\": container with ID starting with 8e889fccc8db70386eddc9ed0c462c3066baf40d117fc162770b5a86aea11ec1 not found: ID does not exist" Mar 08 03:11:22.485090 master-0 kubenswrapper[4048]: I0308 03:11:22.485074 4048 scope.go:117] "RemoveContainer" containerID="0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf" Mar 08 03:11:22.485471 master-0 kubenswrapper[4048]: I0308 03:11:22.485411 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf"} err="failed to get container status \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": rpc error: code = NotFound desc = could not find container \"0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf\": container with ID starting with 0cce35e6d6fd45754844d5bddaa85f542f1266c00ea8f500a3ff412b966c7ecf not found: ID does not exist" Mar 08 03:11:22.485471 master-0 kubenswrapper[4048]: I0308 03:11:22.485445 4048 scope.go:117] "RemoveContainer" containerID="b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d" Mar 08 03:11:22.486094 master-0 kubenswrapper[4048]: I0308 03:11:22.486023 4048 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d"} err="failed to get container status \"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": rpc error: code = NotFound desc = could not find container 
\"b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d\": container with ID starting with b760bf11f83c4247357bf260eda643e1b92139ce75a86d5ea3bf19c3aa8be79d not found: ID does not exist" Mar 08 03:11:22.588030 master-0 kubenswrapper[4048]: I0308 03:11:22.587936 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:22.606328 master-0 kubenswrapper[4048]: W0308 03:11:22.606260 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcceeebd6_19f6_4a3a_a1eb_4ee1174a8cbd.slice/crio-09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331 WatchSource:0}: Error finding container 09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331: Status 404 returned error can't find the container with id 09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331 Mar 08 03:11:22.678658 master-0 kubenswrapper[4048]: I0308 03:11:22.677952 4048 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9mxjn"] Mar 08 03:11:22.679480 master-0 kubenswrapper[4048]: I0308 03:11:22.679416 4048 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9mxjn"] Mar 08 03:11:23.324615 master-0 kubenswrapper[4048]: I0308 03:11:23.324558 4048 generic.go:334] "Generic (PLEG): container finished" podID="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" containerID="a24cd319d12c3bab1bf2b10e5afaf7c8e507dee6da981a24060116593e6e64aa" exitCode=0 Mar 08 03:11:23.325701 master-0 kubenswrapper[4048]: I0308 03:11:23.324684 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerDied","Data":"a24cd319d12c3bab1bf2b10e5afaf7c8e507dee6da981a24060116593e6e64aa"} Mar 08 03:11:23.325701 master-0 kubenswrapper[4048]: I0308 03:11:23.324836 4048 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331"} Mar 08 03:11:23.858666 master-0 kubenswrapper[4048]: I0308 03:11:23.858351 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:23.858815 master-0 kubenswrapper[4048]: E0308 03:11:23.858728 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:23.859078 master-0 kubenswrapper[4048]: I0308 03:11:23.858396 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:23.859236 master-0 kubenswrapper[4048]: E0308 03:11:23.859099 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:23.866876 master-0 kubenswrapper[4048]: I0308 03:11:23.866830 4048 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6612131e-f8b4-43cb-9031-251ac924de96" path="/var/lib/kubelet/pods/6612131e-f8b4-43cb-9031-251ac924de96/volumes" Mar 08 03:11:24.336729 master-0 kubenswrapper[4048]: I0308 03:11:24.336583 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"36a5dbd5dce45c27a09b16ccfd145d7cdb0578dd7709bfca0e2812fc062c2969"} Mar 08 03:11:24.338026 master-0 kubenswrapper[4048]: I0308 03:11:24.337696 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"3367c824f08350a854173781f16487550a44d1a6701b86aef5c68f4cd10641cc"} Mar 08 03:11:24.338026 master-0 kubenswrapper[4048]: I0308 03:11:24.337787 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"cebf0b6daaf1cc65213cd4f50caf254efa57cc3363950e78ab531992adf6d819"} Mar 08 03:11:24.338026 master-0 kubenswrapper[4048]: I0308 03:11:24.337817 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"c71eaab5aeb9cf9bb0dd73741259d69d1186faaf99bdf3920c2d28c53c312fa6"} Mar 08 03:11:24.338026 master-0 kubenswrapper[4048]: I0308 03:11:24.337848 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" 
event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"d2169d2075cab315d481dff72e4a3279a81a19be647394d3efa307ac1373a618"} Mar 08 03:11:24.338026 master-0 kubenswrapper[4048]: I0308 03:11:24.337871 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"220f887d8b51a31013ddf50ea1bb2a90dc65efe0af01b006953646d609eeebb9"} Mar 08 03:11:25.858533 master-0 kubenswrapper[4048]: I0308 03:11:25.858387 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:25.859691 master-0 kubenswrapper[4048]: I0308 03:11:25.858430 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:25.859691 master-0 kubenswrapper[4048]: E0308 03:11:25.858601 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:25.859691 master-0 kubenswrapper[4048]: E0308 03:11:25.858733 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:26.354816 master-0 kubenswrapper[4048]: I0308 03:11:26.354743 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"c15109bbc9dc116f6c47d32797fcee51a841b26733272df7502a4b8708853e20"} Mar 08 03:11:26.871281 master-0 kubenswrapper[4048]: E0308 03:11:26.871207 4048 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 03:11:27.858621 master-0 kubenswrapper[4048]: I0308 03:11:27.858516 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:27.858927 master-0 kubenswrapper[4048]: E0308 03:11:27.858725 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:27.858927 master-0 kubenswrapper[4048]: I0308 03:11:27.858820 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:27.859090 master-0 kubenswrapper[4048]: E0308 03:11:27.858998 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:28.435369 master-0 kubenswrapper[4048]: I0308 03:11:28.435005 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:28.436086 master-0 kubenswrapper[4048]: E0308 03:11:28.435219 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 03:11:28.436086 master-0 kubenswrapper[4048]: E0308 03:11:28.435425 4048 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 03:11:28.436086 master-0 kubenswrapper[4048]: E0308 03:11:28.435444 4048 projected.go:194] Error preparing data for projected volume kube-api-access-b25w4 for pod openshift-network-diagnostics/network-check-target-l5x6h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:11:28.436086 master-0 kubenswrapper[4048]: E0308 03:11:28.435542 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4 podName:aa781f72-e72f-47e1-b37a-977340c182c8 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:00.435521655 +0000 UTC m=+159.400994236 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b25w4" (UniqueName: "kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4") pod "network-check-target-l5x6h" (UID: "aa781f72-e72f-47e1-b37a-977340c182c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 03:11:29.375009 master-0 kubenswrapper[4048]: I0308 03:11:29.371391 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" event={"ID":"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd","Type":"ContainerStarted","Data":"9e146f0c4c072260e4f95a3e3d897233472b4bdafad03d6d159faaec3e49e5d3"} Mar 08 03:11:29.375009 master-0 kubenswrapper[4048]: I0308 03:11:29.371850 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:29.375009 master-0 kubenswrapper[4048]: I0308 03:11:29.372244 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:29.404529 master-0 kubenswrapper[4048]: I0308 03:11:29.403740 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jl9tj"] Mar 08 03:11:29.404529 master-0 kubenswrapper[4048]: I0308 03:11:29.403858 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:29.404529 master-0 kubenswrapper[4048]: E0308 03:11:29.403958 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:29.404741 master-0 kubenswrapper[4048]: I0308 03:11:29.404704 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l5x6h"] Mar 08 03:11:29.404855 master-0 kubenswrapper[4048]: I0308 03:11:29.404823 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:29.405001 master-0 kubenswrapper[4048]: E0308 03:11:29.404947 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:29.409997 master-0 kubenswrapper[4048]: I0308 03:11:29.409394 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:29.420100 master-0 kubenswrapper[4048]: I0308 03:11:29.419944 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" podStartSLOduration=7.41991905 podStartE2EDuration="7.41991905s" podCreationTimestamp="2026-03-08 03:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:11:29.416129685 +0000 UTC m=+128.381602306" watchObservedRunningTime="2026-03-08 03:11:29.41991905 +0000 UTC m=+128.385391661" Mar 08 03:11:30.375558 master-0 kubenswrapper[4048]: I0308 03:11:30.375381 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:30.409907 master-0 
kubenswrapper[4048]: I0308 03:11:30.409807 4048 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:30.859130 master-0 kubenswrapper[4048]: I0308 03:11:30.859070 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:30.859130 master-0 kubenswrapper[4048]: I0308 03:11:30.859106 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:30.859657 master-0 kubenswrapper[4048]: E0308 03:11:30.859198 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:30.859657 master-0 kubenswrapper[4048]: E0308 03:11:30.859327 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:31.873268 master-0 kubenswrapper[4048]: E0308 03:11:31.872832 4048 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 03:11:32.858687 master-0 kubenswrapper[4048]: I0308 03:11:32.858620 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:32.858950 master-0 kubenswrapper[4048]: E0308 03:11:32.858784 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:32.859151 master-0 kubenswrapper[4048]: I0308 03:11:32.859103 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:32.859451 master-0 kubenswrapper[4048]: E0308 03:11:32.859412 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:34.859148 master-0 kubenswrapper[4048]: I0308 03:11:34.859063 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:34.860373 master-0 kubenswrapper[4048]: I0308 03:11:34.859180 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:34.860373 master-0 kubenswrapper[4048]: E0308 03:11:34.859256 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:34.860373 master-0 kubenswrapper[4048]: E0308 03:11:34.859391 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:36.858977 master-0 kubenswrapper[4048]: I0308 03:11:36.858818 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:36.860014 master-0 kubenswrapper[4048]: E0308 03:11:36.859016 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jl9tj" podUID="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" Mar 08 03:11:36.860014 master-0 kubenswrapper[4048]: I0308 03:11:36.858818 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:36.860014 master-0 kubenswrapper[4048]: E0308 03:11:36.859131 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l5x6h" podUID="aa781f72-e72f-47e1-b37a-977340c182c8" Mar 08 03:11:37.549192 master-0 kubenswrapper[4048]: I0308 03:11:37.549107 4048 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Mar 08 03:11:37.667188 master-0 kubenswrapper[4048]: I0308 03:11:37.667112 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"] Mar 08 03:11:37.668270 master-0 kubenswrapper[4048]: I0308 03:11:37.668223 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"] Mar 08 03:11:37.668738 master-0 kubenswrapper[4048]: I0308 03:11:37.668659 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.669023 master-0 kubenswrapper[4048]: I0308 03:11:37.668993 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:37.671338 master-0 kubenswrapper[4048]: I0308 03:11:37.671262 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq"] Mar 08 03:11:37.672300 master-0 kubenswrapper[4048]: I0308 03:11:37.672233 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.673319 master-0 kubenswrapper[4048]: I0308 03:11:37.673212 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.679152 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.679340 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.679685 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.679907 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.680037 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.680114 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.680294 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 03:11:37.680656 master-0 kubenswrapper[4048]: I0308 03:11:37.680516 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 03:11:37.681396 master-0 kubenswrapper[4048]: I0308 
03:11:37.680722 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 08 03:11:37.681396 master-0 kubenswrapper[4048]: I0308 03:11:37.680758 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 03:11:37.681396 master-0 kubenswrapper[4048]: I0308 03:11:37.680794 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 03:11:37.681396 master-0 kubenswrapper[4048]: I0308 03:11:37.680733 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 08 03:11:37.681396 master-0 kubenswrapper[4048]: I0308 03:11:37.680928 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.685828 master-0 kubenswrapper[4048]: I0308 03:11:37.685758 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"] Mar 08 03:11:37.686542 master-0 kubenswrapper[4048]: I0308 03:11:37.686459 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:37.688365 master-0 kubenswrapper[4048]: I0308 03:11:37.688299 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"] Mar 08 03:11:37.689105 master-0 kubenswrapper[4048]: I0308 03:11:37.689050 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:37.690017 master-0 kubenswrapper[4048]: I0308 03:11:37.689969 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 03:11:37.690398 master-0 kubenswrapper[4048]: I0308 03:11:37.690350 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 03:11:37.696573 master-0 kubenswrapper[4048]: I0308 03:11:37.694978 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 03:11:37.696573 master-0 kubenswrapper[4048]: I0308 03:11:37.695158 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:11:37.696573 master-0 kubenswrapper[4048]: I0308 03:11:37.695247 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 03:11:37.696573 master-0 kubenswrapper[4048]: I0308 03:11:37.695504 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 03:11:37.696573 master-0 kubenswrapper[4048]: I0308 03:11:37.696020 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.697030 master-0 kubenswrapper[4048]: I0308 03:11:37.696584 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v"] Mar 08 03:11:37.697403 master-0 kubenswrapper[4048]: I0308 03:11:37.697353 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:37.700567 master-0 kubenswrapper[4048]: I0308 03:11:37.700520 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 03:11:37.705142 master-0 kubenswrapper[4048]: I0308 03:11:37.704964 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.705142 master-0 kubenswrapper[4048]: I0308 03:11:37.705014 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 03:11:37.705843 master-0 kubenswrapper[4048]: I0308 03:11:37.705403 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 03:11:37.705843 master-0 kubenswrapper[4048]: I0308 03:11:37.705604 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 03:11:37.712514 master-0 kubenswrapper[4048]: I0308 03:11:37.710156 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"] Mar 08 03:11:37.712514 master-0 kubenswrapper[4048]: I0308 03:11:37.710741 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"] Mar 08 03:11:37.712514 master-0 kubenswrapper[4048]: I0308 03:11:37.711149 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.712514 master-0 kubenswrapper[4048]: I0308 03:11:37.711375 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"] Mar 08 03:11:37.712514 master-0 kubenswrapper[4048]: I0308 03:11:37.711792 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:37.712514 master-0 kubenswrapper[4048]: I0308 03:11:37.712228 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:37.716522 master-0 kubenswrapper[4048]: I0308 03:11:37.713230 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.721523 master-0 kubenswrapper[4048]: I0308 03:11:37.718444 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"] Mar 08 03:11:37.721523 master-0 kubenswrapper[4048]: I0308 03:11:37.719050 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-z45kw"] Mar 08 03:11:37.721523 master-0 kubenswrapper[4048]: I0308 03:11:37.719438 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"] Mar 08 03:11:37.721523 master-0 kubenswrapper[4048]: I0308 03:11:37.719831 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.724833 master-0 kubenswrapper[4048]: I0308 03:11:37.723934 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:37.730103 master-0 kubenswrapper[4048]: I0308 03:11:37.727356 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:37.735427 master-0 kubenswrapper[4048]: I0308 03:11:37.735361 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 03:11:37.735639 master-0 kubenswrapper[4048]: I0308 03:11:37.735603 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.735772 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.735950 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.736151 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.736316 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.736468 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.736706 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.736968 4048 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.737131 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.737251 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"] Mar 08 03:11:37.737923 master-0 kubenswrapper[4048]: I0308 03:11:37.737680 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"] Mar 08 03:11:37.738352 master-0 kubenswrapper[4048]: I0308 03:11:37.737985 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"] Mar 08 03:11:37.738352 master-0 kubenswrapper[4048]: I0308 03:11:37.738230 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:37.738352 master-0 kubenswrapper[4048]: I0308 03:11:37.738326 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.738731 master-0 kubenswrapper[4048]: I0308 03:11:37.738693 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.739196 master-0 kubenswrapper[4048]: I0308 03:11:37.738815 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 03:11:37.740451 master-0 kubenswrapper[4048]: I0308 03:11:37.739576 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:37.740451 master-0 kubenswrapper[4048]: I0308 03:11:37.739938 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"] Mar 08 03:11:37.745060 master-0 kubenswrapper[4048]: I0308 03:11:37.745016 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:37.746087 master-0 kubenswrapper[4048]: I0308 03:11:37.746012 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 03:11:37.746249 master-0 kubenswrapper[4048]: I0308 03:11:37.746201 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 03:11:37.746349 master-0 kubenswrapper[4048]: I0308 03:11:37.746324 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 03:11:37.746447 master-0 kubenswrapper[4048]: I0308 03:11:37.746428 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 03:11:37.746564 master-0 kubenswrapper[4048]: I0308 03:11:37.746545 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 03:11:37.747831 master-0 kubenswrapper[4048]: I0308 03:11:37.747803 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749467 4048 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jk4m\" (UniqueName: \"kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749568 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749612 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749685 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf57p\" (UniqueName: \"kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749723 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749761 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749797 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749835 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749865 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert\") pod 
\"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749896 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkfwp\" (UniqueName: \"kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749951 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vt6t\" (UniqueName: \"kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.749983 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.750017 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.751709 master-0 kubenswrapper[4048]: I0308 03:11:37.750069 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxqnd\" (UniqueName: \"kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750101 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750132 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750243 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:37.752473 master-0 
kubenswrapper[4048]: I0308 03:11:37.750281 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trhxt\" (UniqueName: \"kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750316 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750342 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2vmz\" (UniqueName: \"kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750418 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750451 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750480 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750656 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lb9w\" (UniqueName: \"kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750686 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750718 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: 
\"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750747 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5ll\" (UniqueName: \"kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:37.752473 master-0 kubenswrapper[4048]: I0308 03:11:37.750779 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.750806 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.750836 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 
03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.750911 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfxd\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.750950 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751133 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751163 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8dn9\" (UniqueName: \"kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751189 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751219 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751248 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751274 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751308 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751410 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751439 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.751582 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"] Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.752101 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"] Mar 08 03:11:37.753635 master-0 kubenswrapper[4048]: I0308 03:11:37.752568 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:37.754435 master-0 kubenswrapper[4048]: I0308 03:11:37.752852 4048 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.754435 master-0 kubenswrapper[4048]: I0308 03:11:37.753052 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:37.754435 master-0 kubenswrapper[4048]: I0308 03:11:37.753393 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:37.754435 master-0 kubenswrapper[4048]: I0308 03:11:37.754242 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:37.755209 master-0 kubenswrapper[4048]: I0308 03:11:37.755131 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:37.755303 master-0 kubenswrapper[4048]: I0308 03:11:37.755219 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbqkj\" (UniqueName: \"kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:37.755303 master-0 kubenswrapper[4048]: I0308 03:11:37.755269 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.755303 master-0 kubenswrapper[4048]: I0308 03:11:37.755303 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:37.755704 master-0 
kubenswrapper[4048]: I0308 03:11:37.755558 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 03:11:37.755704 master-0 kubenswrapper[4048]: I0308 03:11:37.755659 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 03:11:37.755813 master-0 kubenswrapper[4048]: I0308 03:11:37.755730 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.755875 master-0 kubenswrapper[4048]: I0308 03:11:37.755823 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 03:11:37.755933 master-0 kubenswrapper[4048]: I0308 03:11:37.755880 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"] Mar 08 03:11:37.755933 master-0 kubenswrapper[4048]: I0308 03:11:37.755913 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 03:11:37.765199 master-0 kubenswrapper[4048]: I0308 03:11:37.764222 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 03:11:37.765199 master-0 kubenswrapper[4048]: I0308 03:11:37.764407 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 03:11:37.765199 master-0 kubenswrapper[4048]: I0308 03:11:37.764477 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 03:11:37.765199 master-0 kubenswrapper[4048]: I0308 03:11:37.764675 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:37.766089 master-0 kubenswrapper[4048]: I0308 03:11:37.766032 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 03:11:37.766089 master-0 kubenswrapper[4048]: I0308 03:11:37.766063 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:37.766595 master-0 kubenswrapper[4048]: I0308 03:11:37.766552 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 03:11:37.766665 master-0 kubenswrapper[4048]: I0308 03:11:37.766652 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:11:37.766835 master-0 kubenswrapper[4048]: I0308 03:11:37.766789 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:11:37.767016 master-0 kubenswrapper[4048]: I0308 03:11:37.766979 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.769354 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.769518 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8"] Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.769974 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-772zs"] Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 
03:11:37.770102 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.770308 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"] Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.770471 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.770676 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.770815 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" Mar 08 03:11:37.771238 master-0 kubenswrapper[4048]: I0308 03:11:37.771092 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:37.771715 master-0 kubenswrapper[4048]: I0308 03:11:37.771601 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"] Mar 08 03:11:37.772475 master-0 kubenswrapper[4048]: I0308 03:11:37.772030 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:37.772475 master-0 kubenswrapper[4048]: I0308 03:11:37.772030 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"] Mar 08 03:11:37.773355 master-0 kubenswrapper[4048]: I0308 03:11:37.773309 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"] Mar 08 03:11:37.773474 master-0 kubenswrapper[4048]: I0308 03:11:37.773397 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq"] Mar 08 03:11:37.776756 master-0 kubenswrapper[4048]: I0308 03:11:37.775905 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"] Mar 08 03:11:37.776756 master-0 kubenswrapper[4048]: I0308 03:11:37.775940 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"] Mar 08 03:11:37.776756 master-0 kubenswrapper[4048]: I0308 03:11:37.776440 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:11:37.777179 master-0 kubenswrapper[4048]: I0308 03:11:37.777064 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.778685 master-0 kubenswrapper[4048]: I0308 03:11:37.778374 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-z45kw"] Mar 08 03:11:37.783451 master-0 kubenswrapper[4048]: I0308 03:11:37.783001 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 03:11:37.783451 master-0 
kubenswrapper[4048]: I0308 03:11:37.783190 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 03:11:37.783451 master-0 kubenswrapper[4048]: I0308 03:11:37.783425 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.783731 master-0 kubenswrapper[4048]: I0308 03:11:37.783515 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.783731 master-0 kubenswrapper[4048]: I0308 03:11:37.783561 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 03:11:37.783731 master-0 kubenswrapper[4048]: I0308 03:11:37.783670 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 03:11:37.783731 master-0 kubenswrapper[4048]: I0308 03:11:37.783713 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 03:11:37.783877 master-0 kubenswrapper[4048]: I0308 03:11:37.783792 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 03:11:37.783877 master-0 kubenswrapper[4048]: I0308 03:11:37.783839 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 03:11:37.783972 master-0 kubenswrapper[4048]: I0308 03:11:37.783885 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 03:11:37.784014 master-0 kubenswrapper[4048]: I0308 03:11:37.783979 4048 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Mar 08 03:11:37.784168 master-0 kubenswrapper[4048]: I0308 03:11:37.784079 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 03:11:37.787633 master-0 kubenswrapper[4048]: I0308 03:11:37.784812 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 03:11:37.787633 master-0 kubenswrapper[4048]: I0308 03:11:37.784938 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:11:37.787633 master-0 kubenswrapper[4048]: I0308 03:11:37.785079 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 03:11:37.787633 master-0 kubenswrapper[4048]: I0308 03:11:37.783426 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 03:11:37.787633 master-0 kubenswrapper[4048]: I0308 03:11:37.787603 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"] Mar 08 03:11:37.789369 master-0 kubenswrapper[4048]: I0308 03:11:37.789294 4048 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-g86jc"] Mar 08 03:11:37.791369 master-0 kubenswrapper[4048]: I0308 03:11:37.791267 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"] Mar 08 03:11:37.792326 master-0 kubenswrapper[4048]: I0308 03:11:37.792228 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:37.795403 master-0 kubenswrapper[4048]: I0308 03:11:37.794057 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"] Mar 08 03:11:37.795403 master-0 kubenswrapper[4048]: I0308 03:11:37.794538 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 03:11:37.795624 master-0 kubenswrapper[4048]: I0308 03:11:37.795443 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 03:11:37.800881 master-0 kubenswrapper[4048]: I0308 03:11:37.800760 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 03:11:37.801116 master-0 kubenswrapper[4048]: I0308 03:11:37.801080 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"] Mar 08 03:11:37.802034 master-0 kubenswrapper[4048]: I0308 03:11:37.801963 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 03:11:37.803588 master-0 kubenswrapper[4048]: I0308 03:11:37.803374 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"] Mar 08 03:11:37.806675 master-0 kubenswrapper[4048]: I0308 03:11:37.806323 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8"] Mar 08 03:11:37.812508 master-0 kubenswrapper[4048]: I0308 03:11:37.811300 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-772zs"] Mar 08 03:11:37.820175 master-0 
kubenswrapper[4048]: I0308 03:11:37.820138 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"] Mar 08 03:11:37.822783 master-0 kubenswrapper[4048]: I0308 03:11:37.822700 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"] Mar 08 03:11:37.823421 master-0 kubenswrapper[4048]: I0308 03:11:37.823399 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"] Mar 08 03:11:37.824212 master-0 kubenswrapper[4048]: I0308 03:11:37.824176 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v"] Mar 08 03:11:37.825307 master-0 kubenswrapper[4048]: I0308 03:11:37.825153 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"] Mar 08 03:11:37.834114 master-0 kubenswrapper[4048]: I0308 03:11:37.834039 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"] Mar 08 03:11:37.839909 master-0 kubenswrapper[4048]: I0308 03:11:37.839830 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"] Mar 08 03:11:37.839998 master-0 kubenswrapper[4048]: I0308 03:11:37.839913 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"] Mar 08 03:11:37.843135 master-0 kubenswrapper[4048]: I0308 03:11:37.841859 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"] Mar 08 03:11:37.843135 master-0 kubenswrapper[4048]: I0308 03:11:37.842703 4048 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"] Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.864910 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.864966 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdxk\" (UniqueName: \"kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.864999 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865030 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.871612 master-0 
kubenswrapper[4048]: I0308 03:11:37.865057 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf57p\" (UniqueName: \"kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865079 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865100 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb4bq\" (UniqueName: \"kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865121 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865146 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865171 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865195 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865222 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865253 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkfwp\" (UniqueName: \"kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp\") pod 
\"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: I0308 03:11:37.865283 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.871612 master-0 kubenswrapper[4048]: E0308 03:11:37.865506 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: E0308 03:11:37.865596 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.365569651 +0000 UTC m=+137.331042222 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: E0308 03:11:37.866843 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: E0308 03:11:37.866941 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.366908888 +0000 UTC m=+137.332381499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867021 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llf9g\" (UniqueName: \"kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867072 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets\") pod 
\"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867114 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vt6t\" (UniqueName: \"kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867154 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867194 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867228 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 
03:11:37.867266 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqnd\" (UniqueName: \"kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867302 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"
Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867336 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867370 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:11:37.873257 master-0 kubenswrapper[4048]: I0308 03:11:37.867406 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trhxt\" (UniqueName: \"kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867447 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2vmz\" (UniqueName: \"kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867479 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867544 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867579 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf24l\" (UniqueName: \"kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867618 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867650 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867687 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867722 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lb9w\" (UniqueName: \"kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867742 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867756 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5ll\" (UniqueName: \"kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867796 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.867829 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.869832 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.869877 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:11:37.873844 master-0 kubenswrapper[4048]: I0308 03:11:37.869948 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.869988 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870039 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870080 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mv56\" (UniqueName: \"kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56\") pod \"csi-snapshot-controller-operator-5685fbc7d-8fxl8\" (UID: \"ba9496ed-060e-4118-9da6-89b82bd49263\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870116 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870153 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870194 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfxd\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870233 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870269 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zh5b\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.870302 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: E0308 03:11:37.871264 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: E0308 03:11:37.871313 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.871432 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.868581 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: I0308 03:11:37.871554 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"
Mar 08 03:11:37.874537 master-0 kubenswrapper[4048]: E0308 03:11:37.871741 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.371724194 +0000 UTC m=+137.337196765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: E0308 03:11:37.871786 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.371772226 +0000 UTC m=+137.337244797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: E0308 03:11:37.871828 4048 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: E0308 03:11:37.871831 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: E0308 03:11:37.871853 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.371844588 +0000 UTC m=+137.337317159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: E0308 03:11:37.871865 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.371859549 +0000 UTC m=+137.337332120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.872344 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.872651 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.872703 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.873109 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.873504 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.873938 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.874127 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.874168 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.874183 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dn9\" (UniqueName: \"kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:11:37.875087 master-0 kubenswrapper[4048]: I0308 03:11:37.874251 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874287 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874324 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874363 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874401 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874437 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874472 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874538 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874595 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874629 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874665 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874718 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874753 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874787 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874828 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"
Mar 08 03:11:37.876287 master-0 kubenswrapper[4048]: I0308 03:11:37.874869 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc"
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: I0308 03:11:37.875247 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: I0308 03:11:37.875801 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: E0308 03:11:37.875932 4048 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: E0308 03:11:37.875997 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.375974451 +0000 UTC m=+137.341447062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: I0308 03:11:37.876241 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: I0308 03:11:37.876341 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: I0308 03:11:37.876725 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:11:37.876914 master-0 kubenswrapper[4048]: I0308 03:11:37.876890 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:11:37.877291 master-0 kubenswrapper[4048]: I0308 03:11:37.874910 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbqkj\" (UniqueName: \"kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:11:37.877291 master-0 kubenswrapper[4048]: I0308 03:11:37.876996 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:11:37.877291 master-0 kubenswrapper[4048]: I0308 03:11:37.877032 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lghdk\" (UniqueName: \"kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:11:37.877291 master-0 kubenswrapper[4048]: I0308 03:11:37.877047 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:37.877291 master-0 kubenswrapper[4048]: I0308 03:11:37.877073 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:37.877291 master-0 kubenswrapper[4048]: I0308 03:11:37.877183 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:11:37.879068 master-0 kubenswrapper[4048]: I0308 03:11:37.878286 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:11:37.879068 master-0 kubenswrapper[4048]: I0308 03:11:37.878768 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlj9x\" (UniqueName: \"kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:11:37.879068 master-0 kubenswrapper[4048]: I0308 03:11:37.878835 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jk4m\" (UniqueName: \"kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"
Mar 08 03:11:37.879068 master-0 kubenswrapper[4048]: I0308 03:11:37.878874 4048 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:37.879068 master-0 kubenswrapper[4048]: I0308 03:11:37.879017 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:37.879068 master-0 kubenswrapper[4048]: I0308 03:11:37.879051 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:37.879636 master-0 kubenswrapper[4048]: I0308 03:11:37.879371 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:37.879636 
master-0 kubenswrapper[4048]: I0308 03:11:37.879395 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:37.879636 master-0 kubenswrapper[4048]: I0308 03:11:37.879514 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.882778 master-0 kubenswrapper[4048]: I0308 03:11:37.882722 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:37.895461 master-0 kubenswrapper[4048]: I0308 03:11:37.890999 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.895461 master-0 kubenswrapper[4048]: I0308 03:11:37.894782 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert\") pod 
\"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:37.900701 master-0 kubenswrapper[4048]: I0308 03:11:37.900644 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhxt\" (UniqueName: \"kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:37.901645 master-0 kubenswrapper[4048]: I0308 03:11:37.901419 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkfwp\" (UniqueName: \"kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:37.901645 master-0 kubenswrapper[4048]: I0308 03:11:37.901517 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dn9\" (UniqueName: \"kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:37.901645 master-0 kubenswrapper[4048]: I0308 03:11:37.901611 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf57p\" (UniqueName: \"kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:37.902575 master-0 kubenswrapper[4048]: I0308 03:11:37.902551 
4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5ll\" (UniqueName: \"kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:37.904048 master-0 kubenswrapper[4048]: I0308 03:11:37.903690 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:37.904048 master-0 kubenswrapper[4048]: I0308 03:11:37.904010 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqnd\" (UniqueName: \"kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:37.905004 master-0 kubenswrapper[4048]: I0308 03:11:37.904959 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lb9w\" (UniqueName: \"kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:37.906332 master-0 kubenswrapper[4048]: I0308 03:11:37.906307 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access\") pod 
\"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.907820 master-0 kubenswrapper[4048]: I0308 03:11:37.907789 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfxd\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:37.908381 master-0 kubenswrapper[4048]: I0308 03:11:37.908357 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vt6t\" (UniqueName: \"kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:37.908636 master-0 kubenswrapper[4048]: I0308 03:11:37.908608 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:37.917230 master-0 kubenswrapper[4048]: I0308 03:11:37.917190 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2vmz\" (UniqueName: \"kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:37.920709 master-0 kubenswrapper[4048]: I0308 03:11:37.920682 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbqkj\" (UniqueName: \"kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:37.927175 master-0 kubenswrapper[4048]: I0308 03:11:37.927151 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.940623 master-0 kubenswrapper[4048]: I0308 03:11:37.940587 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:37.947633 master-0 kubenswrapper[4048]: I0308 03:11:37.947570 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:37.969011 master-0 kubenswrapper[4048]: I0308 03:11:37.968969 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jk4m\" (UniqueName: \"kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:37.980544 master-0 kubenswrapper[4048]: I0308 03:11:37.980302 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:37.980544 master-0 kubenswrapper[4048]: I0308 03:11:37.980351 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:37.980544 master-0 kubenswrapper[4048]: I0308 03:11:37.980407 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf24l\" (UniqueName: \"kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:37.980544 master-0 kubenswrapper[4048]: I0308 03:11:37.980436 4048 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.980544 master-0 kubenswrapper[4048]: I0308 03:11:37.980444 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:37.980773 master-0 kubenswrapper[4048]: I0308 03:11:37.980566 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:37.980773 master-0 kubenswrapper[4048]: I0308 03:11:37.980630 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:37.980773 master-0 kubenswrapper[4048]: I0308 03:11:37.980684 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mv56\" (UniqueName: \"kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56\") pod 
\"csi-snapshot-controller-operator-5685fbc7d-8fxl8\" (UID: \"ba9496ed-060e-4118-9da6-89b82bd49263\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" Mar 08 03:11:37.980773 master-0 kubenswrapper[4048]: I0308 03:11:37.980735 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:37.980936 master-0 kubenswrapper[4048]: I0308 03:11:37.980803 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:37.980978 master-0 kubenswrapper[4048]: E0308 03:11:37.980955 4048 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981124 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zh5b\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981164 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: 
\"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981227 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981251 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981285 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981310 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 
03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981347 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981371 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghdk\" (UniqueName: \"kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981395 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981415 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlj9x\" (UniqueName: \"kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981445 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981468 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdxk\" (UniqueName: \"kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: E0308 03:11:37.981520 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.481498482 +0000 UTC m=+137.446971063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found Mar 08 03:11:37.983493 master-0 kubenswrapper[4048]: I0308 03:11:37.981572 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: I0308 03:11:37.981651 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb4bq\" (UniqueName: \"kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: I0308 03:11:37.981689 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: I0308 03:11:37.981739 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: 
\"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: I0308 03:11:37.981849 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llf9g\" (UniqueName: \"kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: I0308 03:11:37.982049 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: I0308 03:11:37.982634 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.982710 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.982728 4048 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.982767 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c 
nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.482750195 +0000 UTC m=+137.448222766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.982784 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.982784 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.482775236 +0000 UTC m=+137.448247807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.982832 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.482820908 +0000 UTC m=+137.448293709 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: I0308 03:11:37.983088 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.983347 4048 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:11:37.984013 master-0 kubenswrapper[4048]: E0308 03:11:37.983384 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.483370857 +0000 UTC m=+137.448843428 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found Mar 08 03:11:37.984557 master-0 kubenswrapper[4048]: I0308 03:11:37.983760 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:37.993757 master-0 kubenswrapper[4048]: E0308 03:11:37.990617 4048 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:11:37.993757 master-0 kubenswrapper[4048]: E0308 03:11:37.990700 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:38.490680019 +0000 UTC m=+137.456152590 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found Mar 08 03:11:37.993757 master-0 kubenswrapper[4048]: I0308 03:11:37.991297 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:37.993757 master-0 kubenswrapper[4048]: I0308 03:11:37.992834 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:37.993757 master-0 kubenswrapper[4048]: I0308 03:11:37.993040 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:37.993757 master-0 kubenswrapper[4048]: I0308 03:11:37.993365 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script\") pod \"iptables-alerter-g86jc\" 
(UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:37.993757 master-0 kubenswrapper[4048]: I0308 03:11:37.993662 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:37.997509 master-0 kubenswrapper[4048]: I0308 03:11:37.996379 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:38.006768 master-0 kubenswrapper[4048]: I0308 03:11:38.005467 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:38.014655 master-0 kubenswrapper[4048]: I0308 03:11:38.014573 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mv56\" (UniqueName: \"kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56\") pod \"csi-snapshot-controller-operator-5685fbc7d-8fxl8\" (UID: \"ba9496ed-060e-4118-9da6-89b82bd49263\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" Mar 08 03:11:38.027179 master-0 kubenswrapper[4048]: I0308 03:11:38.027141 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdxk\" (UniqueName: \"kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:38.081364 master-0 kubenswrapper[4048]: I0308 03:11:38.077209 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:38.081364 master-0 kubenswrapper[4048]: I0308 03:11:38.080137 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" Mar 08 03:11:38.081364 master-0 kubenswrapper[4048]: I0308 03:11:38.080468 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:38.101840 master-0 kubenswrapper[4048]: I0308 03:11:38.101788 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:38.108677 master-0 kubenswrapper[4048]: I0308 03:11:38.108637 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlj9x\" (UniqueName: \"kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:38.114952 master-0 kubenswrapper[4048]: I0308 03:11:38.114912 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghdk\" (UniqueName: \"kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:38.125815 master-0 kubenswrapper[4048]: I0308 03:11:38.122350 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:38.132877 master-0 kubenswrapper[4048]: I0308 03:11:38.132447 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:38.133154 master-0 kubenswrapper[4048]: I0308 03:11:38.133070 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:38.144789 master-0 kubenswrapper[4048]: I0308 03:11:38.144749 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:38.147958 master-0 kubenswrapper[4048]: I0308 03:11:38.147919 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb4bq\" (UniqueName: \"kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:38.281843 master-0 kubenswrapper[4048]: I0308 03:11:38.281195 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"] Mar 08 03:11:38.317187 master-0 kubenswrapper[4048]: I0308 03:11:38.317112 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:38.320332 master-0 kubenswrapper[4048]: I0308 03:11:38.320304 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf24l\" (UniqueName: \"kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:38.325100 master-0 kubenswrapper[4048]: I0308 03:11:38.324267 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llf9g\" (UniqueName: \"kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:38.325100 master-0 kubenswrapper[4048]: I0308 03:11:38.324802 4048 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zh5b\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: I0308 03:11:38.391097 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: I0308 
03:11:38.391149 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391299 4048 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391510 4048 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391560 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.391535432 +0000 UTC m=+138.357008003 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: I0308 03:11:38.391560 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391581 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.391573553 +0000 UTC m=+138.357046124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: I0308 03:11:38.391714 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391673 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391790 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391828 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.391813382 +0000 UTC m=+138.357285953 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391844 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.391837422 +0000 UTC m=+138.357309993 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: I0308 03:11:38.391873 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: I0308 03:11:38.391894 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: I0308 03:11:38.391938 4048 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:38.392533 master-0 kubenswrapper[4048]: E0308 03:11:38.391999 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:11:38.393086 master-0 kubenswrapper[4048]: E0308 03:11:38.392018 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.392013108 +0000 UTC m=+138.357485679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found Mar 08 03:11:38.393086 master-0 kubenswrapper[4048]: E0308 03:11:38.392049 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:38.393086 master-0 kubenswrapper[4048]: E0308 03:11:38.392064 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.3920595 +0000 UTC m=+138.357532071 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:38.393086 master-0 kubenswrapper[4048]: E0308 03:11:38.392096 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:11:38.393086 master-0 kubenswrapper[4048]: E0308 03:11:38.392114 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.392106812 +0000 UTC m=+138.357579383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found Mar 08 03:11:38.402870 master-0 kubenswrapper[4048]: I0308 03:11:38.402792 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerStarted","Data":"6c725404c21fc1f8e1d386945b71a5debdae8332b549c2d533bc3d6a6b387f25"} Mar 08 03:11:38.427097 master-0 kubenswrapper[4048]: I0308 03:11:38.426802 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:38.442866 master-0 kubenswrapper[4048]: I0308 03:11:38.442822 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:38.452949 master-0 kubenswrapper[4048]: W0308 03:11:38.452897 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod275be8d3_df30_46f7_9d0a_806e404dfd57.slice/crio-fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575 WatchSource:0}: Error finding container fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575: Status 404 returned error can't find the container with id fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575 Mar 08 03:11:38.492701 master-0 kubenswrapper[4048]: I0308 03:11:38.492423 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:38.492701 master-0 kubenswrapper[4048]: I0308 03:11:38.492694 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:38.492701 master-0 kubenswrapper[4048]: I0308 03:11:38.492716 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:38.492972 master-0 kubenswrapper[4048]: I0308 03:11:38.492766 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:38.492972 master-0 kubenswrapper[4048]: I0308 03:11:38.492807 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:38.492972 master-0 kubenswrapper[4048]: I0308 03:11:38.492832 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:38.492972 master-0 kubenswrapper[4048]: E0308 03:11:38.492642 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:38.492972 master-0 kubenswrapper[4048]: E0308 03:11:38.492961 4048 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.492948212 +0000 UTC m=+138.458420783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:38.493133 master-0 kubenswrapper[4048]: E0308 03:11:38.493001 4048 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:38.493133 master-0 kubenswrapper[4048]: E0308 03:11:38.493020 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.493013784 +0000 UTC m=+138.458486355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found Mar 08 03:11:38.493133 master-0 kubenswrapper[4048]: E0308 03:11:38.493053 4048 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:11:38.493133 master-0 kubenswrapper[4048]: E0308 03:11:38.493068 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:11:39.493063646 +0000 UTC m=+138.458536217 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found Mar 08 03:11:38.493133 master-0 kubenswrapper[4048]: E0308 03:11:38.493102 4048 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:38.493133 master-0 kubenswrapper[4048]: E0308 03:11:38.493119 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.493113317 +0000 UTC m=+138.458585878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:38.493295 master-0 kubenswrapper[4048]: E0308 03:11:38.493150 4048 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:11:38.493295 master-0 kubenswrapper[4048]: E0308 03:11:38.493166 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.493161519 +0000 UTC m=+138.458634080 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found Mar 08 03:11:38.493295 master-0 kubenswrapper[4048]: E0308 03:11:38.492915 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:11:38.493295 master-0 kubenswrapper[4048]: E0308 03:11:38.493185 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:39.4931806 +0000 UTC m=+138.458653171 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found Mar 08 03:11:38.505863 master-0 kubenswrapper[4048]: I0308 03:11:38.505284 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"] Mar 08 03:11:38.513994 master-0 kubenswrapper[4048]: I0308 03:11:38.513132 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"] Mar 08 03:11:38.516443 master-0 kubenswrapper[4048]: I0308 03:11:38.516386 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"] Mar 08 03:11:38.522646 master-0 kubenswrapper[4048]: W0308 03:11:38.521813 4048 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb83ab56c_e28d_4e82_ae8f_92649a1448ed.slice/crio-9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f WatchSource:0}: Error finding container 9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f: Status 404 returned error can't find the container with id 9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f Mar 08 03:11:38.522784 master-0 kubenswrapper[4048]: W0308 03:11:38.522747 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaadbbe97_2a03_40da_846d_252e29661f67.slice/crio-11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b WatchSource:0}: Error finding container 11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b: Status 404 returned error can't find the container with id 11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b Mar 08 03:11:38.525027 master-0 kubenswrapper[4048]: W0308 03:11:38.524691 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3178dfc0_a35e_418e_a954_cd919b8af88c.slice/crio-755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba WatchSource:0}: Error finding container 755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba: Status 404 returned error can't find the container with id 755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba Mar 08 03:11:38.595464 master-0 kubenswrapper[4048]: I0308 03:11:38.595407 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8"] Mar 08 03:11:38.612510 master-0 kubenswrapper[4048]: I0308 03:11:38.612463 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"] Mar 08 03:11:38.615977 master-0 kubenswrapper[4048]: I0308 03:11:38.615930 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"] Mar 08 03:11:38.624920 master-0 kubenswrapper[4048]: I0308 03:11:38.624879 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq"] Mar 08 03:11:38.628249 master-0 kubenswrapper[4048]: I0308 03:11:38.628209 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"] Mar 08 03:11:38.631690 master-0 kubenswrapper[4048]: I0308 03:11:38.631586 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v"] Mar 08 03:11:38.632661 master-0 kubenswrapper[4048]: I0308 03:11:38.632369 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"] Mar 08 03:11:38.649550 master-0 kubenswrapper[4048]: I0308 03:11:38.649137 4048 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"] Mar 08 03:11:38.653967 master-0 kubenswrapper[4048]: W0308 03:11:38.653925 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c0192f3_2e60_42c6_9836_c70a9fa407d5.slice/crio-e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3 WatchSource:0}: Error finding container e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3: Status 404 returned error can't find the container with id e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3 Mar 08 03:11:38.661705 master-0 kubenswrapper[4048]: W0308 03:11:38.661672 4048 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf08a644f_3b61_46a7_a7b6_a9f7f2f7d266.slice/crio-ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa WatchSource:0}: Error finding container ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa: Status 404 returned error can't find the container with id ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa Mar 08 03:11:38.662519 master-0 kubenswrapper[4048]: W0308 03:11:38.662080 4048 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70fba73e_c201_4866_bc69_64892ea5bdca.slice/crio-23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab WatchSource:0}: Error finding container 23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab: Status 404 returned error can't find the container with id 23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: E0308 03:11:38.667416 4048 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: container &Container{Name:authentication-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953,Command:[/bin/bash -ec],Args:[if [ -s /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt ]; then Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: echo "Copying system trust bundle" Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: cp -f /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: fi Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: exec authentication-operator operator --config=/var/run/configmaps/config/operator-config.yaml --v=2 
--terminate-on-files=/var/run/configmaps/trusted-ca-bundle/ca-bundle.crt --terminate-on-files=/tmp/terminate Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE_OAUTH_SERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d3571ade02a7c61123d62c53fda6a57031a52c058c0571759dc09f96b23978f,ValueFrom:nil,},EnvVar{Name:IMAGE_OAUTH_APISERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_OAUTH_SERVER_IMAGE_VERSION,Value:4.18.34_openshift,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{209715200 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/trusted-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/service-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxqnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 
8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod authentication-operator-7c6989d6c4-zqlnx_openshift-authentication-operator(f08a644f-3b61-46a7-a7b6-a9f7f2f7d266): ErrImagePull: pull QPS exceeded Mar 08 03:11:38.667445 master-0 kubenswrapper[4048]: > logger="UnhandledError" Mar 08 03:11:38.669043 master-0 kubenswrapper[4048]: E0308 03:11:38.669012 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" Mar 08 03:11:38.859616 master-0 kubenswrapper[4048]: I0308 03:11:38.859058 4048 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:38.860211 master-0 kubenswrapper[4048]: I0308 03:11:38.859081 4048 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:38.862185 master-0 kubenswrapper[4048]: I0308 03:11:38.862004 4048 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 03:11:38.862957 master-0 kubenswrapper[4048]: I0308 03:11:38.862764 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 03:11:38.872057 master-0 kubenswrapper[4048]: I0308 03:11:38.871851 4048 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 03:11:39.409116 master-0 kubenswrapper[4048]: I0308 03:11:39.409028 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" event={"ID":"b83ab56c-e28d-4e82-ae8f-92649a1448ed","Type":"ContainerStarted","Data":"9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f"} Mar 08 03:11:39.409918 master-0 kubenswrapper[4048]: I0308 03:11:39.409821 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:39.409918 master-0 kubenswrapper[4048]: I0308 03:11:39.409859 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:39.409918 master-0 kubenswrapper[4048]: I0308 
03:11:39.409891 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:39.410192 master-0 kubenswrapper[4048]: E0308 03:11:39.410041 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:39.410192 master-0 kubenswrapper[4048]: E0308 03:11:39.410120 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.410099602 +0000 UTC m=+140.375572173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:39.410192 master-0 kubenswrapper[4048]: E0308 03:11:39.410125 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:11:39.410192 master-0 kubenswrapper[4048]: E0308 03:11:39.410173 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.410156384 +0000 UTC m=+140.375629025 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found Mar 08 03:11:39.410329 master-0 kubenswrapper[4048]: E0308 03:11:39.410193 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:39.410458 master-0 kubenswrapper[4048]: I0308 03:11:39.410367 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: E0308 03:11:39.410426 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: E0308 03:11:39.410447 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.410431243 +0000 UTC m=+140.375903814 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: I0308 03:11:39.410635 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: I0308 03:11:39.410665 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: E0308 03:11:39.410687 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.410665221 +0000 UTC m=+140.376137792 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: I0308 03:11:39.410715 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: E0308 03:11:39.410724 4048 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:39.410765 master-0 kubenswrapper[4048]: E0308 03:11:39.410751 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.410742134 +0000 UTC m=+140.376214835 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found Mar 08 03:11:39.411035 master-0 kubenswrapper[4048]: E0308 03:11:39.410986 4048 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:11:39.411035 master-0 kubenswrapper[4048]: E0308 03:11:39.411027 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.411018713 +0000 UTC m=+140.376491284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found Mar 08 03:11:39.411286 master-0 kubenswrapper[4048]: E0308 03:11:39.411178 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:11:39.411286 master-0 kubenswrapper[4048]: E0308 03:11:39.411273 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.411263462 +0000 UTC m=+140.376736033 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found Mar 08 03:11:39.412113 master-0 kubenswrapper[4048]: I0308 03:11:39.411614 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" event={"ID":"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266","Type":"ContainerStarted","Data":"ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa"} Mar 08 03:11:39.413417 master-0 kubenswrapper[4048]: I0308 03:11:39.413197 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" event={"ID":"c0a08ddb-1045-4631-ba52-93f3046ebd0a","Type":"ContainerStarted","Data":"a9677e44cf88488e86493a105f95a756fe5dcdb4e68b6740b2fed8252e50fe4c"} Mar 08 03:11:39.413417 master-0 kubenswrapper[4048]: E0308 03:11:39.413350 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" Mar 08 03:11:39.414263 master-0 kubenswrapper[4048]: I0308 03:11:39.414222 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" event={"ID":"6432d23b-a55a-4131-83d5-5f16419809dd","Type":"ContainerStarted","Data":"ee50d0908ef8218db69129064969d66d86843ed87cd667dcc60ef7e0d8a70f21"} Mar 08 03:11:39.418989 master-0 kubenswrapper[4048]: I0308 03:11:39.418960 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" event={"ID":"8c0192f3-2e60-42c6-9836-c70a9fa407d5","Type":"ContainerStarted","Data":"e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3"} Mar 08 03:11:39.423474 master-0 kubenswrapper[4048]: I0308 03:11:39.423410 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-g86jc" event={"ID":"275be8d3-df30-46f7-9d0a-806e404dfd57","Type":"ContainerStarted","Data":"fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575"} Mar 08 03:11:39.427717 master-0 kubenswrapper[4048]: I0308 03:11:39.425749 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" event={"ID":"aadbbe97-2a03-40da-846d-252e29661f67","Type":"ContainerStarted","Data":"11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b"} Mar 08 03:11:39.429561 master-0 kubenswrapper[4048]: I0308 03:11:39.428646 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" event={"ID":"3178dfc0-a35e-418e-a954-cd919b8af88c","Type":"ContainerStarted","Data":"a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0"} Mar 08 03:11:39.429561 master-0 kubenswrapper[4048]: I0308 03:11:39.428674 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" event={"ID":"3178dfc0-a35e-418e-a954-cd919b8af88c","Type":"ContainerStarted","Data":"755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba"} Mar 08 03:11:39.433056 master-0 kubenswrapper[4048]: I0308 03:11:39.433008 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" 
event={"ID":"70fba73e-c201-4866-bc69-64892ea5bdca","Type":"ContainerStarted","Data":"23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab"} Mar 08 03:11:39.436111 master-0 kubenswrapper[4048]: I0308 03:11:39.436080 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerStarted","Data":"47b9c8e39f771f4a9c9b48442e3a3c8bc53bf486bacb3ac02dc486b0fde5415d"} Mar 08 03:11:39.437579 master-0 kubenswrapper[4048]: I0308 03:11:39.437424 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" event={"ID":"e71caa06-6ce7-47c9-a267-21f6b6af9247","Type":"ContainerStarted","Data":"f94d3d5f4c829c12aa9cdd79c3a8b919521e9b4705852dc7634f236661eedb2a"} Mar 08 03:11:39.441599 master-0 kubenswrapper[4048]: I0308 03:11:39.441548 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" event={"ID":"ba9496ed-060e-4118-9da6-89b82bd49263","Type":"ContainerStarted","Data":"92d33cd4d391db44fa59251ab4f865e88339c3b4327ec053c536f051a308ce2b"} Mar 08 03:11:39.447758 master-0 kubenswrapper[4048]: I0308 03:11:39.447703 4048 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" podStartSLOduration=104.447685169 podStartE2EDuration="1m44.447685169s" podCreationTimestamp="2026-03-08 03:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:11:39.447473991 +0000 UTC m=+138.412946562" watchObservedRunningTime="2026-03-08 03:11:39.447685169 +0000 UTC m=+138.413157740" Mar 08 03:11:39.512168 master-0 kubenswrapper[4048]: I0308 03:11:39.512067 4048 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:39.512168 master-0 kubenswrapper[4048]: I0308 03:11:39.512121 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:39.513065 master-0 kubenswrapper[4048]: E0308 03:11:39.512897 4048 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:11:39.513361 master-0 kubenswrapper[4048]: E0308 03:11:39.513249 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: E0308 03:11:39.516595 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.516568316 +0000 UTC m=+140.482040887 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: E0308 03:11:39.516617 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.516610767 +0000 UTC m=+140.482083338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: I0308 03:11:39.516697 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: I0308 03:11:39.516749 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: I0308 03:11:39.516768 4048 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: I0308 03:11:39.516874 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: E0308 03:11:39.516998 4048 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:39.517083 master-0 kubenswrapper[4048]: E0308 03:11:39.517030 4048 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:39.518527 master-0 kubenswrapper[4048]: E0308 03:11:39.517372 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.517020931 +0000 UTC m=+140.482493502 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:39.518527 master-0 kubenswrapper[4048]: E0308 03:11:39.517392 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.517385704 +0000 UTC m=+140.482858275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found Mar 08 03:11:39.518527 master-0 kubenswrapper[4048]: E0308 03:11:39.517449 4048 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:11:39.518527 master-0 kubenswrapper[4048]: E0308 03:11:39.517470 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.517463887 +0000 UTC m=+140.482936458 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found Mar 08 03:11:39.519199 master-0 kubenswrapper[4048]: E0308 03:11:39.519160 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:39.519271 master-0 kubenswrapper[4048]: E0308 03:11:39.519256 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:41.519236048 +0000 UTC m=+140.484708619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:40.447648 master-0 kubenswrapper[4048]: E0308 03:11:40.447601 4048 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" Mar 08 03:11:41.451059 master-0 kubenswrapper[4048]: I0308 03:11:41.450948 4048 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" 
containerID="d7c267c7d1ad40b10c4f9d19008802c751a1cdd3364f0744ee013a61bcad5ca6" exitCode=0 Mar 08 03:11:41.451059 master-0 kubenswrapper[4048]: I0308 03:11:41.450997 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerDied","Data":"d7c267c7d1ad40b10c4f9d19008802c751a1cdd3364f0744ee013a61bcad5ca6"} Mar 08 03:11:41.454137 master-0 kubenswrapper[4048]: I0308 03:11:41.454089 4048 generic.go:334] "Generic (PLEG): container finished" podID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerID="2ecfec74d59cc3d0b000968048ba4bb7931b60227ed12aaa53445141ec092ff9" exitCode=0 Mar 08 03:11:41.454193 master-0 kubenswrapper[4048]: I0308 03:11:41.454135 4048 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerDied","Data":"2ecfec74d59cc3d0b000968048ba4bb7931b60227ed12aaa53445141ec092ff9"} Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824427 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824503 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:41.825126 
master-0 kubenswrapper[4048]: I0308 03:11:41.824554 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824578 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: E0308 03:11:41.824607 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824626 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: E0308 03:11:41.824659 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.82464241 +0000 UTC m=+144.790114981 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824680 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824741 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824852 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824874 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824892 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824911 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824929 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: I0308 03:11:41.824957 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:41.825126 master-0 kubenswrapper[4048]: E0308 03:11:41.824687 4048 secret.go:189] Couldn't get secret 
openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:41.826203 master-0 kubenswrapper[4048]: E0308 03:11:41.825133 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.825122846 +0000 UTC m=+144.790595417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:41.834039 master-0 kubenswrapper[4048]: E0308 03:11:41.824757 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:11:41.834039 master-0 kubenswrapper[4048]: E0308 03:11:41.824838 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:41.834245 master-0 kubenswrapper[4048]: E0308 03:11:41.825037 4048 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:41.834245 master-0 kubenswrapper[4048]: E0308 03:11:41.834122 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.834092775 +0000 UTC m=+144.799565346 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found Mar 08 03:11:41.834245 master-0 kubenswrapper[4048]: E0308 03:11:41.834195 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.834155347 +0000 UTC m=+144.799627918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:41.834245 master-0 kubenswrapper[4048]: E0308 03:11:41.834229 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.834210199 +0000 UTC m=+144.799682780 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:41.834245 master-0 kubenswrapper[4048]: E0308 03:11:41.825053 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:41.836010 master-0 kubenswrapper[4048]: E0308 03:11:41.834271 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.834262251 +0000 UTC m=+144.799734832 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:41.836010 master-0 kubenswrapper[4048]: E0308 03:11:41.825102 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:11:41.836010 master-0 kubenswrapper[4048]: E0308 03:11:41.834341 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.834322193 +0000 UTC m=+144.799794764 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found Mar 08 03:11:41.836775 master-0 kubenswrapper[4048]: E0308 03:11:41.836728 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:11:41.836828 master-0 kubenswrapper[4048]: E0308 03:11:41.836804 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.836787088 +0000 UTC m=+144.802259649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found Mar 08 03:11:41.836923 master-0 kubenswrapper[4048]: E0308 03:11:41.836883 4048 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:41.836978 master-0 kubenswrapper[4048]: E0308 03:11:41.836961 4048 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:11:41.837011 master-0 kubenswrapper[4048]: E0308 03:11:41.836994 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.836959504 +0000 UTC m=+144.802432075 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found Mar 08 03:11:41.837040 master-0 kubenswrapper[4048]: E0308 03:11:41.837021 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.837009986 +0000 UTC m=+144.802482557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found Mar 08 03:11:41.837068 master-0 kubenswrapper[4048]: E0308 03:11:41.837044 4048 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:11:41.837100 master-0 kubenswrapper[4048]: E0308 03:11:41.837088 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.837072708 +0000 UTC m=+144.802545279 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found Mar 08 03:11:41.837189 master-0 kubenswrapper[4048]: E0308 03:11:41.837153 4048 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:41.837242 master-0 kubenswrapper[4048]: E0308 03:11:41.837212 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.837198212 +0000 UTC m=+144.802670783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found Mar 08 03:11:41.837351 master-0 kubenswrapper[4048]: E0308 03:11:41.837309 4048 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:11:41.837407 master-0 kubenswrapper[4048]: E0308 03:11:41.837368 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:45.837354038 +0000 UTC m=+144.802826619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found Mar 08 03:11:45.692748 master-0 kubenswrapper[4048]: I0308 03:11:45.692425 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:45.692748 master-0 kubenswrapper[4048]: E0308 03:11:45.692640 4048 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 03:11:45.693347 master-0 kubenswrapper[4048]: E0308 03:11:45.692849 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:12:49.69283044 +0000 UTC m=+208.658303011 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : secret "metrics-daemon-secret" not found Mar 08 03:11:45.893704 master-0 kubenswrapper[4048]: I0308 03:11:45.893643 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:45.893704 master-0 kubenswrapper[4048]: I0308 03:11:45.893693 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:45.893925 master-0 kubenswrapper[4048]: I0308 03:11:45.893726 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:45.893925 master-0 kubenswrapper[4048]: E0308 03:11:45.893819 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:11:45.893925 master-0 kubenswrapper[4048]: E0308 03:11:45.893869 4048 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.893854717 +0000 UTC m=+152.859327288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found Mar 08 03:11:45.893925 master-0 kubenswrapper[4048]: I0308 03:11:45.893886 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:45.893925 master-0 kubenswrapper[4048]: E0308 03:11:45.893819 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:45.893925 master-0 kubenswrapper[4048]: E0308 03:11:45.893917 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.893955 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.89393564 +0000 UTC m=+152.859408211 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.893959 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.893973 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.893965281 +0000 UTC m=+152.859437852 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: I0308 03:11:45.893914 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.893991 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.893981752 +0000 UTC m=+152.859454323 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.893992 4048 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: I0308 03:11:45.894008 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.894026 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894014223 +0000 UTC m=+152.859486804 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: I0308 03:11:45.894048 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.894066 4048 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: I0308 03:11:45.894080 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:45.894090 master-0 kubenswrapper[4048]: E0308 03:11:45.894095 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894086055 +0000 UTC m=+152.859558626 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894132 4048 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894136 4048 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894170 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894156978 +0000 UTC m=+152.859629589 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: I0308 03:11:45.894197 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: I0308 03:11:45.894228 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: I0308 03:11:45.894245 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: I0308 03:11:45.894261 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod 
\"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: I0308 03:11:45.894278 4048 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894319 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894312763 +0000 UTC m=+152.859785334 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found
Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894355 4048 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894370 4048 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894393 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894380925 +0000 UTC m=+152.859853496 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894407 4048 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894408 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894401546 +0000 UTC m=+152.859874117 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found
Mar 08 03:11:45.894461 master-0 kubenswrapper[4048]: E0308 03:11:45.894437 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894428787 +0000 UTC m=+152.859901358 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found
Mar 08 03:11:45.894914 master-0 kubenswrapper[4048]: E0308 03:11:45.894467 4048 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:11:45.894914 master-0 kubenswrapper[4048]: E0308 03:11:45.894520 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.89451101 +0000 UTC m=+152.859983581 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:11:45.894914 master-0 kubenswrapper[4048]: E0308 03:11:45.894547 4048 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:11:45.894914 master-0 kubenswrapper[4048]: E0308 03:11:45.894603 4048 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.894587602 +0000 UTC m=+152.860060203 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found
Mar 08 03:11:47.250806 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Mar 08 03:11:47.277809 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Mar 08 03:11:47.278281 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Mar 08 03:11:47.284088 master-0 systemd[1]: kubelet.service: Consumed 10.876s CPU time.
Mar 08 03:11:47.305963 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 08 03:11:47.455868 master-0 kubenswrapper[7648]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:11:47.457027 master-0 kubenswrapper[7648]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 08 03:11:47.457094 master-0 kubenswrapper[7648]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:11:47.457169 master-0 kubenswrapper[7648]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:11:47.457228 master-0 kubenswrapper[7648]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 08 03:11:47.457285 master-0 kubenswrapper[7648]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
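The MountVolume.SetUp failures earlier in this boot all follow the same pattern: a pod's secret volume names a secret that does not exist yet. When triaging a dump like this, it can help to reduce the repetition to a pod-to-missing-secret map. A minimal sketch in Python (an illustrative helper written for this log format, not part of kubelet or any Kubernetes tooling):

```python
import re

# Each "MountVolume.SetUp failed" record names the volume, the pod, and the
# secret that could not be found. Collect them into {pod: {missing secrets}}.
PATTERN = re.compile(
    r'MountVolume\.SetUp failed for volume "(?P<volume>[^"]+)".*?'
    r'pod "(?P<pod>[^"]+)".*?: secret "(?P<secret>[^"]+)" not found'
)

def missing_secrets(log_text: str) -> dict:
    """Map each pod to the secret(s) its volume mounts are waiting on."""
    result: dict = {}
    for m in PATTERN.finditer(log_text):
        result.setdefault(m.group("pod"), set()).add(m.group("secret"))
    return result
```

Running this over the records above would show, for example, that `catalog-operator-7d9c49f57b-vsnbw` is waiting on `catalog-operator-serving-cert`. During early cluster bootstrap such errors are usually transient: the kubelet retries the mount (note the 8s `durationBeforeRetry`) and succeeds once the operator that publishes the secret has run.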
Mar 08 03:11:47.457591 master-0 kubenswrapper[7648]: I0308 03:11:47.457458 7648 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 08 03:11:47.460657 master-0 kubenswrapper[7648]: W0308 03:11:47.460639 7648 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:11:47.460758 master-0 kubenswrapper[7648]: W0308 03:11:47.460746 7648 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:11:47.460829 master-0 kubenswrapper[7648]: W0308 03:11:47.460818 7648 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:11:47.460893 master-0 kubenswrapper[7648]: W0308 03:11:47.460883 7648 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:11:47.460956 master-0 kubenswrapper[7648]: W0308 03:11:47.460946 7648 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:11:47.461028 master-0 kubenswrapper[7648]: W0308 03:11:47.461017 7648 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:11:47.461170 master-0 kubenswrapper[7648]: W0308 03:11:47.461159 7648 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:11:47.461235 master-0 kubenswrapper[7648]: W0308 03:11:47.461225 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:11:47.461297 master-0 kubenswrapper[7648]: W0308 03:11:47.461287 7648 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:11:47.461358 master-0 kubenswrapper[7648]: W0308 03:11:47.461348 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:11:47.461420 master-0 kubenswrapper[7648]: W0308 03:11:47.461410 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:11:47.461500 master-0 kubenswrapper[7648]: W0308 03:11:47.461472 7648 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:11:47.461578 master-0 kubenswrapper[7648]: W0308 03:11:47.461566 7648 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:11:47.461643 master-0 kubenswrapper[7648]: W0308 03:11:47.461632 7648 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:11:47.461707 master-0 kubenswrapper[7648]: W0308 03:11:47.461696 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:11:47.461771 master-0 kubenswrapper[7648]: W0308 03:11:47.461760 7648 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:11:47.461833 master-0 kubenswrapper[7648]: W0308 03:11:47.461823 7648 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:11:47.461895 master-0 kubenswrapper[7648]: W0308 03:11:47.461884 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:11:47.461959 master-0 kubenswrapper[7648]: W0308 03:11:47.461948 7648 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:11:47.462025 master-0 kubenswrapper[7648]: W0308 03:11:47.462015 7648 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:11:47.462087 master-0 kubenswrapper[7648]: W0308 03:11:47.462077 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:11:47.462148 master-0 kubenswrapper[7648]: W0308 03:11:47.462138 7648 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:11:47.462213 master-0 kubenswrapper[7648]: W0308 03:11:47.462203 7648 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:11:47.462277 master-0 kubenswrapper[7648]: W0308 03:11:47.462265 7648 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:11:47.462364 master-0 kubenswrapper[7648]: W0308 03:11:47.462352 7648 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:11:47.462448 master-0 kubenswrapper[7648]: W0308 03:11:47.462433 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:11:47.462558 master-0 kubenswrapper[7648]: W0308 03:11:47.462546 7648 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:11:47.462627 master-0 kubenswrapper[7648]: W0308 03:11:47.462616 7648 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:11:47.462691 master-0 kubenswrapper[7648]: W0308 03:11:47.462680 7648 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:11:47.462760 master-0 kubenswrapper[7648]: W0308 03:11:47.462749 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:11:47.462823 master-0 kubenswrapper[7648]: W0308 03:11:47.462813 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:11:47.462907 master-0 kubenswrapper[7648]: W0308 03:11:47.462892 7648 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:11:47.463001 master-0 kubenswrapper[7648]: W0308 03:11:47.462987 7648 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:11:47.463090 master-0 kubenswrapper[7648]: W0308 03:11:47.463075 7648 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:11:47.463162 master-0 kubenswrapper[7648]: W0308 03:11:47.463151 7648 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:11:47.463228 master-0 kubenswrapper[7648]: W0308 03:11:47.463218 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:11:47.463291 master-0 kubenswrapper[7648]: W0308 03:11:47.463281 7648 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:11:47.463358 master-0 kubenswrapper[7648]: W0308 03:11:47.463347 7648 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:11:47.463418 master-0 kubenswrapper[7648]: W0308 03:11:47.463408 7648 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:11:47.463481 master-0 kubenswrapper[7648]: W0308 03:11:47.463470 7648 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:11:47.463569 master-0 kubenswrapper[7648]: W0308 03:11:47.463558 7648 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:11:47.463632 master-0 kubenswrapper[7648]: W0308 03:11:47.463622 7648 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:11:47.463694 master-0 kubenswrapper[7648]: W0308 03:11:47.463684 7648 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:11:47.463763 master-0 kubenswrapper[7648]: W0308 03:11:47.463753 7648 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:11:47.463824 master-0 kubenswrapper[7648]: W0308 03:11:47.463814 7648 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:11:47.463886 master-0 kubenswrapper[7648]: W0308 03:11:47.463876 7648 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:11:47.463948 master-0 kubenswrapper[7648]: W0308 03:11:47.463937 7648 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:11:47.464010 master-0 kubenswrapper[7648]: W0308 03:11:47.463999 7648 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:11:47.464071 master-0 kubenswrapper[7648]: W0308 03:11:47.464061 7648 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:11:47.464134 master-0 kubenswrapper[7648]: W0308 03:11:47.464124 7648 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:11:47.464281 master-0 kubenswrapper[7648]: W0308 03:11:47.464270 7648 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:11:47.464390 master-0 kubenswrapper[7648]: W0308 03:11:47.464379 7648 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:11:47.464454 master-0 kubenswrapper[7648]: W0308 03:11:47.464444 7648 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:11:47.464540 master-0 kubenswrapper[7648]: W0308 03:11:47.464529 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:11:47.464613 master-0 kubenswrapper[7648]: W0308 03:11:47.464602 7648 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:11:47.464677 master-0 kubenswrapper[7648]: W0308 03:11:47.464667 7648 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:11:47.464739 master-0 kubenswrapper[7648]: W0308 03:11:47.464729 7648 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:11:47.464806 master-0 kubenswrapper[7648]: W0308 03:11:47.464795 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:11:47.464869 master-0 kubenswrapper[7648]: W0308 03:11:47.464859 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:11:47.464934 master-0 kubenswrapper[7648]: W0308 03:11:47.464924 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:11:47.464995 master-0 kubenswrapper[7648]: W0308 03:11:47.464985 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:11:47.465059 master-0 kubenswrapper[7648]: W0308 03:11:47.465048 7648 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:11:47.465115 master-0 kubenswrapper[7648]: W0308 03:11:47.465106 7648 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:11:47.465177 master-0 kubenswrapper[7648]: W0308 03:11:47.465167 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:11:47.465244 master-0 kubenswrapper[7648]: W0308 03:11:47.465234 7648 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:11:47.465305 master-0 kubenswrapper[7648]: W0308 03:11:47.465295 7648 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:11:47.465366 master-0 kubenswrapper[7648]: W0308 03:11:47.465356 7648 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:11:47.465422 master-0 kubenswrapper[7648]: W0308 03:11:47.465413 7648 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:11:47.465501 master-0 kubenswrapper[7648]: W0308 03:11:47.465473 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:11:47.465573 master-0 kubenswrapper[7648]: W0308 03:11:47.465562 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:11:47.465636 master-0 kubenswrapper[7648]: W0308 03:11:47.465625 7648 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:11:47.465704 master-0 kubenswrapper[7648]: W0308 03:11:47.465694 7648 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:11:47.465891 master-0 kubenswrapper[7648]: I0308 03:11:47.465871 7648 flags.go:64] FLAG: --address="0.0.0.0"
Mar 08 03:11:47.465984 master-0 kubenswrapper[7648]: I0308 03:11:47.465967 7648 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 08 03:11:47.466071 master-0 kubenswrapper[7648]: I0308 03:11:47.466057 7648 flags.go:64] FLAG: --anonymous-auth="true"
Mar 08 03:11:47.466139 master-0 kubenswrapper[7648]: I0308 03:11:47.466126 7648 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 08 03:11:47.466204 master-0 kubenswrapper[7648]: I0308 03:11:47.466193 7648 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 08 03:11:47.466275 master-0 kubenswrapper[7648]: I0308 03:11:47.466262 7648 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 08 03:11:47.466343 master-0 kubenswrapper[7648]: I0308 03:11:47.466330 7648 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 08 03:11:47.466408 master-0 kubenswrapper[7648]: I0308 03:11:47.466396 7648 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 08 03:11:47.466471 master-0 kubenswrapper[7648]: I0308 03:11:47.466460 7648 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 08 03:11:47.466568 master-0 kubenswrapper[7648]: I0308 03:11:47.466552 7648 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 08 03:11:47.466658 master-0 kubenswrapper[7648]: I0308 03:11:47.466642 7648 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 08 03:11:47.466734 master-0 kubenswrapper[7648]: I0308 03:11:47.466721 7648 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 08 03:11:47.466826 master-0 kubenswrapper[7648]: I0308 03:11:47.466810 7648 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 08 03:11:47.466920 master-0 kubenswrapper[7648]: I0308 03:11:47.466904 7648 flags.go:64] FLAG: --cgroup-root=""
Mar 08 03:11:47.467033 master-0 kubenswrapper[7648]: I0308 03:11:47.467015 7648 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 08 03:11:47.467126 master-0 kubenswrapper[7648]: I0308 03:11:47.467111 7648 flags.go:64] FLAG: --client-ca-file=""
Mar 08 03:11:47.467215 master-0 kubenswrapper[7648]: I0308 03:11:47.467199 7648 flags.go:64] FLAG: --cloud-config=""
Mar 08 03:11:47.467303 master-0 kubenswrapper[7648]: I0308 03:11:47.467288 7648 flags.go:64] FLAG: --cloud-provider=""
Mar 08 03:11:47.467391 master-0 kubenswrapper[7648]: I0308 03:11:47.467371 7648 flags.go:64] FLAG: --cluster-dns="[]"
Mar 08 03:11:47.467514 master-0 kubenswrapper[7648]: I0308 03:11:47.467471 7648 flags.go:64] FLAG: --cluster-domain=""
Mar 08 03:11:47.467626 master-0 kubenswrapper[7648]: I0308 03:11:47.467609 7648 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 08 03:11:47.467721 master-0 kubenswrapper[7648]: I0308 03:11:47.467705 7648 flags.go:64] FLAG: --config-dir=""
Mar 08 03:11:47.467817 master-0 kubenswrapper[7648]: I0308 03:11:47.467801 7648 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 08 03:11:47.467911 master-0 kubenswrapper[7648]: I0308 03:11:47.467893 7648 flags.go:64] FLAG: --container-log-max-files="5"
Mar 08 03:11:47.467993 master-0 kubenswrapper[7648]: I0308 03:11:47.467980 7648 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 08 03:11:47.468063 master-0 kubenswrapper[7648]: I0308 03:11:47.468051 7648 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 08 03:11:47.468130 master-0 kubenswrapper[7648]: I0308 03:11:47.468118 7648 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 08 03:11:47.468200 master-0 kubenswrapper[7648]: I0308 03:11:47.468189 7648 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 08 03:11:47.468270 master-0 kubenswrapper[7648]: I0308 03:11:47.468259 7648 flags.go:64] FLAG: --contention-profiling="false"
Mar 08 03:11:47.468397 master-0 kubenswrapper[7648]: I0308 03:11:47.468384 7648 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 08 03:11:47.468465 master-0 kubenswrapper[7648]: I0308 03:11:47.468453 7648 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 08 03:11:47.468572 master-0 kubenswrapper[7648]: I0308 03:11:47.468557 7648 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 08 03:11:47.468652 master-0 kubenswrapper[7648]: I0308 03:11:47.468632 7648 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 08 03:11:47.468719 master-0 kubenswrapper[7648]: I0308 03:11:47.468708 7648 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 08 03:11:47.468789 master-0 kubenswrapper[7648]: I0308 03:11:47.468778 7648 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 08 03:11:47.468932 master-0 kubenswrapper[7648]: I0308 03:11:47.468919 7648 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 08 03:11:47.469002 master-0 kubenswrapper[7648]: I0308 03:11:47.468990 7648 flags.go:64] FLAG: --enable-load-reader="false"
Mar 08 03:11:47.469067 master-0 kubenswrapper[7648]: I0308 03:11:47.469056 7648 flags.go:64] FLAG: --enable-server="true"
Mar 08 03:11:47.469130 master-0 kubenswrapper[7648]: I0308 03:11:47.469116 7648 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 08 03:11:47.469202 master-0 kubenswrapper[7648]: I0308 03:11:47.469190 7648 flags.go:64] FLAG: --event-burst="100"
Mar 08 03:11:47.469277 master-0 kubenswrapper[7648]: I0308 03:11:47.469265 7648 flags.go:64] FLAG: --event-qps="50"
Mar 08 03:11:47.469343 master-0 kubenswrapper[7648]: I0308 03:11:47.469332 7648 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 08 03:11:47.469409 master-0 kubenswrapper[7648]: I0308 03:11:47.469397 7648 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 08 03:11:47.469475 master-0 kubenswrapper[7648]: I0308 03:11:47.469462 7648 flags.go:64] FLAG: --eviction-hard=""
Mar 08 03:11:47.469570 master-0 kubenswrapper[7648]: I0308 03:11:47.469557 7648 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 08 03:11:47.469647 master-0 kubenswrapper[7648]: I0308 03:11:47.469635 7648 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 08 03:11:47.469723 master-0 kubenswrapper[7648]: I0308 03:11:47.469711 7648 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 08 03:11:47.469790 master-0 kubenswrapper[7648]: I0308 03:11:47.469779 7648 flags.go:64] FLAG: --eviction-soft=""
Mar 08 03:11:47.469855 master-0 kubenswrapper[7648]: I0308 03:11:47.469844 7648 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 08 03:11:47.469919 master-0 kubenswrapper[7648]: I0308 03:11:47.469908 7648 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 08 03:11:47.469987 master-0 kubenswrapper[7648]: I0308 03:11:47.469975 7648 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 08 03:11:47.470053 master-0 kubenswrapper[7648]: I0308 03:11:47.470041 7648 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 08 03:11:47.470117 master-0 kubenswrapper[7648]: I0308 03:11:47.470106 7648 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 08 03:11:47.470182 master-0 kubenswrapper[7648]: I0308 03:11:47.470170 7648 flags.go:64] FLAG: --fail-swap-on="true"
Mar 08 03:11:47.470253 master-0 kubenswrapper[7648]: I0308 03:11:47.470240 7648 flags.go:64] FLAG: --feature-gates=""
Mar 08 03:11:47.470327 master-0 kubenswrapper[7648]: I0308 03:11:47.470316 7648 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 08 03:11:47.470395 master-0 kubenswrapper[7648]: I0308 03:11:47.470383 7648 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 08 03:11:47.470462 master-0 kubenswrapper[7648]: I0308 03:11:47.470451 7648 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 08 03:11:47.470565 master-0 kubenswrapper[7648]: I0308 03:11:47.470550 7648 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 08 03:11:47.470641 master-0 kubenswrapper[7648]: I0308 03:11:47.470629 7648 flags.go:64] FLAG: --healthz-port="10248"
Mar 08 03:11:47.470702 master-0 kubenswrapper[7648]: I0308 03:11:47.470690 7648 flags.go:64] FLAG: --help="false"
Mar 08 03:11:47.470771 master-0 kubenswrapper[7648]: I0308 03:11:47.470760 7648 flags.go:64] FLAG: --hostname-override=""
Mar 08 03:11:47.470835 master-0 kubenswrapper[7648]: I0308 03:11:47.470824 7648 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 08 03:11:47.470896 master-0 kubenswrapper[7648]: I0308 03:11:47.470884 7648 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 08 03:11:47.470961 master-0 kubenswrapper[7648]: I0308 03:11:47.470950 7648 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 08 03:11:47.471045 master-0 kubenswrapper[7648]: I0308 03:11:47.471033 7648 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 08 03:11:47.471118 master-0 kubenswrapper[7648]: I0308 03:11:47.471107 7648 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 08 03:11:47.471187 master-0 kubenswrapper[7648]: I0308 03:11:47.471176 7648 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 08 03:11:47.471255 master-0 kubenswrapper[7648]: I0308 03:11:47.471244 7648 flags.go:64] FLAG: --image-service-endpoint=""
Mar 08 03:11:47.471321 master-0 kubenswrapper[7648]: I0308 03:11:47.471310 7648 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 08 03:11:47.471386 master-0 kubenswrapper[7648]: I0308 03:11:47.471374 7648 flags.go:64] FLAG: --kube-api-burst="100"
Mar 08 03:11:47.471456 master-0 kubenswrapper[7648]: I0308 03:11:47.471444 7648 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 08 03:11:47.471571 master-0 kubenswrapper[7648]: I0308 03:11:47.471558 7648 flags.go:64] FLAG: --kube-api-qps="50"
Mar 08 03:11:47.471652 master-0 kubenswrapper[7648]: I0308 03:11:47.471637 7648 flags.go:64] FLAG: --kube-reserved=""
Mar 08 03:11:47.471728 master-0 kubenswrapper[7648]: I0308 03:11:47.471715 7648 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 08 03:11:47.471820 master-0 kubenswrapper[7648]: I0308 03:11:47.471807 7648 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 08 03:11:47.471903 master-0 kubenswrapper[7648]: I0308 03:11:47.471890 7648 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 08 03:11:47.471972 master-0 kubenswrapper[7648]: I0308 03:11:47.471961 7648 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 08 03:11:47.472038 master-0 kubenswrapper[7648]: I0308 03:11:47.472027 7648 flags.go:64] FLAG: --lock-file=""
Mar 08 03:11:47.472102 master-0 kubenswrapper[7648]: I0308 03:11:47.472091 7648 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 08 03:11:47.472167 master-0 kubenswrapper[7648]: I0308 03:11:47.472156 7648 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 08 03:11:47.472235 master-0 kubenswrapper[7648]: I0308 03:11:47.472220 7648 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 08 03:11:47.472305 master-0 kubenswrapper[7648]: I0308 03:11:47.472293 7648 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 08 03:11:47.472370 master-0 kubenswrapper[7648]: I0308 03:11:47.472359 7648 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 08 03:11:47.472431 master-0 kubenswrapper[7648]: I0308 03:11:47.472419 7648 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 08 03:11:47.472536 master-0 kubenswrapper[7648]: I0308 03:11:47.472520 7648 flags.go:64] FLAG: --logging-format="text"
Mar 08 03:11:47.472622 master-0 kubenswrapper[7648]: I0308 03:11:47.472610 7648 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 08 03:11:47.472694 master-0 kubenswrapper[7648]: I0308 03:11:47.472683 7648 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 08 03:11:47.472760 master-0 kubenswrapper[7648]: I0308 03:11:47.472749 7648 flags.go:64] FLAG: --manifest-url=""
Mar 08 03:11:47.472829 master-0 kubenswrapper[7648]: I0308 03:11:47.472814 7648 flags.go:64] FLAG: --manifest-url-header=""
Mar 08 03:11:47.472900 master-0 kubenswrapper[7648]: I0308 03:11:47.472887 7648 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 03:11:47.472969 master-0 kubenswrapper[7648]: I0308 03:11:47.472956 7648 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 03:11:47.473030 master-0 kubenswrapper[7648]: I0308 03:11:47.473019 7648 flags.go:64] FLAG: --max-pods="110"
Mar 08 03:11:47.473096 master-0 kubenswrapper[7648]: I0308 03:11:47.473085 7648 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 03:11:47.473163 master-0 kubenswrapper[7648]: I0308 03:11:47.473152 7648 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 03:11:47.473228 master-0 kubenswrapper[7648]: I0308 03:11:47.473217 7648 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 03:11:47.473293 master-0 kubenswrapper[7648]: I0308 03:11:47.473281 7648 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 03:11:47.473364 master-0 kubenswrapper[7648]: I0308 03:11:47.473352 7648 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 03:11:47.473426 master-0 kubenswrapper[7648]: I0308 03:11:47.473415 7648 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 08 03:11:47.473533 master-0 kubenswrapper[7648]: I0308 03:11:47.473506 7648 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 03:11:47.473623 master-0 kubenswrapper[7648]: I0308 03:11:47.473610 7648 flags.go:64] FLAG: --node-status-max-images="50"
Mar 08 03:11:47.473693 master-0 kubenswrapper[7648]: I0308 03:11:47.473681 7648 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 08 03:11:47.473793 master-0 kubenswrapper[7648]: I0308 03:11:47.473747 7648 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 08 03:11:47.473874 master-0 kubenswrapper[7648]: I0308 03:11:47.473861 7648 flags.go:64] FLAG: --pod-cidr=""
Mar 08 03:11:47.473948 master-0 kubenswrapper[7648]: I0308 03:11:47.473934 7648 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 08 03:11:47.474010 master-0 kubenswrapper[7648]: I0308 03:11:47.473999 7648 flags.go:64] FLAG: --pod-manifest-path=""
Mar 08 03:11:47.474075 master-0 kubenswrapper[7648]: I0308 03:11:47.474064 7648 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 08 03:11:47.474143 master-0 kubenswrapper[7648]: I0308 03:11:47.474130 7648 flags.go:64] FLAG: --pods-per-core="0"
Mar 08 03:11:47.474216 master-0 kubenswrapper[7648]: I0308 03:11:47.474204 7648 flags.go:64] FLAG: --port="10250"
Mar 08 03:11:47.474281 master-0 kubenswrapper[7648]: I0308 03:11:47.474269 7648 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 08 03:11:47.474345 master-0 kubenswrapper[7648]: I0308 03:11:47.474334 7648 flags.go:64] FLAG: --provider-id=""
Mar 08 03:11:47.474416 master-0 kubenswrapper[7648]: I0308 03:11:47.474404 7648 flags.go:64] FLAG: --qos-reserved=""
Mar 08 03:11:47.474505 master-0 kubenswrapper[7648]: I0308 03:11:47.474471 7648 flags.go:64] FLAG: --read-only-port="10255"
Mar 08 03:11:47.474598 master-0 kubenswrapper[7648]: I0308 03:11:47.474583 7648 flags.go:64] FLAG: --register-node="true"
Mar 08 03:11:47.474673 master-0 kubenswrapper[7648]: I0308 03:11:47.474659 7648 flags.go:64] FLAG: --register-schedulable="true"
Mar 08 03:11:47.474752 master-0 kubenswrapper[7648]: I0308 03:11:47.474735 7648 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 08 03:11:47.474839 master-0 kubenswrapper[7648]: I0308 03:11:47.474825 7648 flags.go:64] FLAG: --registry-burst="10"
Mar 08 03:11:47.474920 master-0 kubenswrapper[7648]: I0308 03:11:47.474905 7648 flags.go:64] FLAG: --registry-qps="5"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475022 7648 flags.go:64] FLAG: --reserved-cpus=""
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475035 7648 flags.go:64] FLAG: --reserved-memory=""
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475043 7648 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475050 7648 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475056 7648 flags.go:64] FLAG: --rotate-certificates="false"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475063 7648 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475070 7648 flags.go:64] FLAG: --runonce="false"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475077 7648 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475085 7648 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475093 7648 flags.go:64] FLAG: --seccomp-default="false"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475100 7648 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475108 7648 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475115 7648 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475122 7648 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475130 7648 flags.go:64] FLAG: --storage-driver-password="root"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475137 7648 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475145 7648 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475152 7648 flags.go:64] FLAG: --storage-driver-user="root"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475159 7648 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475167 7648 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475174 7648 flags.go:64] FLAG: --system-cgroups=""
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475182 7648 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475194 7648 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475201 7648 flags.go:64] FLAG: --tls-cert-file=""
Mar 08 03:11:47.476538 master-0 kubenswrapper[7648]: I0308 03:11:47.475208 7648 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475220 7648 flags.go:64] FLAG: --tls-min-version=""
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475229 7648 flags.go:64] FLAG: --tls-private-key-file=""
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475237 7648 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475244 7648 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475254 7648 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475261 7648 flags.go:64] FLAG: --v="2"
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475271 7648 flags.go:64] FLAG: --version="false"
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475280 7648 flags.go:64] FLAG: --vmodule=""
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475289 7648 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: I0308 03:11:47.475297 7648 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475476 7648 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475512 7648 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475520 7648 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475527 7648 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475534 7648 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475542 7648 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475550 7648 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475558 7648 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475566 7648 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475573 7648 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475580 7648 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:11:47.477328 master-0 kubenswrapper[7648]: W0308 03:11:47.475586 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475593 7648 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475601 7648 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475609 7648 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475616 7648 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475622 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475629 7648 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475635 7648 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475641 7648 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475648 7648 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475654 7648 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475660 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475667 7648 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475674 7648 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475680 7648 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475689 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475695 7648 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475701 7648 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475707 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475713 7648 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:11:47.477926 master-0 kubenswrapper[7648]: W0308 03:11:47.475720 7648 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475726 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475732 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475738 7648 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475744 7648 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475750 7648 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475757 7648 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475763 7648 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475769 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475775 7648 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475782 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475788 7648 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475794 7648 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475824 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475833 7648 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475840 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475847 7648 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475853 7648 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475860 7648 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475867 7648 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:11:47.478387 master-0 kubenswrapper[7648]: W0308 03:11:47.475874 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475881 7648 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475888 7648 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475895 7648 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475904 7648 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475910 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475918 7648 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475924 7648 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475931 7648 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475939 7648 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475947 7648 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475956 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475963 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475969 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475978 7648 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475986 7648 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.475994 7648 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.476001 7648 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.476008 7648 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:11:47.478897 master-0 kubenswrapper[7648]: W0308 03:11:47.476014 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:11:47.479391 master-0 kubenswrapper[7648]: W0308 03:11:47.476021 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:11:47.479391 master-0 kubenswrapper[7648]: I0308 03:11:47.476043 7648 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:11:47.486251 master-0 kubenswrapper[7648]: I0308 03:11:47.486087 7648 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 08 03:11:47.486406 master-0 kubenswrapper[7648]: I0308 03:11:47.486385 7648 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 08 03:11:47.486598 master-0 kubenswrapper[7648]: W0308 03:11:47.486576 7648 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:11:47.486725 master-0 kubenswrapper[7648]: W0308 03:11:47.486702 7648 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:11:47.486762 master-0 kubenswrapper[7648]: W0308 03:11:47.486725 7648 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:11:47.486762 master-0 kubenswrapper[7648]: W0308 03:11:47.486735 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:11:47.486977 master-0 kubenswrapper[7648]: W0308 03:11:47.486921 7648 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.486977 7648 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.486987 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.487024 7648 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.487037 7648 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.487049 7648 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.487058 7648 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.487067 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:11:47.487075 master-0 kubenswrapper[7648]: W0308 03:11:47.487077 7648 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487088 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487100 7648 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487111 7648 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487121 7648 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487190 7648 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487202 7648 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487213 7648 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487222 7648 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487229 7648 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487237 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487245 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487254 7648 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487262 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487270 7648 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487278 7648 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487285 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487293 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487301 7648 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487310 7648 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:11:47.487346 master-0 kubenswrapper[7648]: W0308 03:11:47.487319 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487329 7648 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487337 7648 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487345 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487356 7648 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487365 7648 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487374 7648 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487383 7648 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487391 7648 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487399 7648 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487407 7648 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487415 7648 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487423 7648 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487432 7648 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487440 7648 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487452 7648 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487465 7648 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487478 7648 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487515 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:11:47.487825 master-0 kubenswrapper[7648]: W0308 03:11:47.487528 7648 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487539 7648 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487550 7648 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487560 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487570 7648 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487579 7648 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487586 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487597 7648 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487605 7648 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487613 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487621 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487629 7648 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487645 7648 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487654 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487662 7648 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487670 7648 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487678 7648 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487686 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487693 7648 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487702 7648 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:11:47.488263 master-0 kubenswrapper[7648]: W0308 03:11:47.487711 7648 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: I0308 03:11:47.487725 7648 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488001 7648 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488016 7648 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488025 7648 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488034 7648 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488043 7648 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488051 7648 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488060 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488068 7648 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488077 7648 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488085 7648 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488093 7648 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488100 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488108 7648 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:11:47.488844 master-0 kubenswrapper[7648]: W0308 03:11:47.488116 7648 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488127 7648 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488137 7648 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488146 7648 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488221 7648 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488233 7648 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488245 7648 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488255 7648 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488313 7648 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488328 7648 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488340 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488356 7648 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488409 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488422 7648 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488430 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488438 7648 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488447 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488456 7648 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488520 7648 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:11:47.489183 master-0 kubenswrapper[7648]: W0308 03:11:47.488532 7648 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488544 7648 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488552 7648 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488561 7648 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488570 7648 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488612 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488622 7648 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488658 7648 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488668 7648 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488712 7648 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488720 7648 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.488728 7648 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.489174 7648 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.489190 7648 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.489199 7648 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:11:47.489686 master-0 kubenswrapper[7648]: W0308 03:11:47.489208 7648 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.489676 7648 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.489738 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.489747 7648 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.489761 7648 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.489777 7648 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.489989 7648 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.490006 7648 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.490017 7648 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:11:47.490031 master-0 kubenswrapper[7648]: W0308 03:11:47.490028 7648 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490042 7648 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490053 7648 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490064 7648 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490074 7648 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490084 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490095 7648 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490105 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490115 7648 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490126 7648 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490136 7648 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490147 7648 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490156 7648 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490166 7648 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490175 7648 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:11:47.490241 master-0 kubenswrapper[7648]: W0308 03:11:47.490185 7648 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:11:47.490604 master-0 kubenswrapper[7648]: I0308 03:11:47.490201 7648 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:11:47.490713 master-0 kubenswrapper[7648]: I0308
03:11:47.490687 7648 server.go:940] "Client rotation is on, will bootstrap in background" Mar 08 03:11:47.496985 master-0 kubenswrapper[7648]: I0308 03:11:47.496948 7648 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 08 03:11:47.497100 master-0 kubenswrapper[7648]: I0308 03:11:47.497077 7648 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 08 03:11:47.497584 master-0 kubenswrapper[7648]: I0308 03:11:47.497553 7648 server.go:997] "Starting client certificate rotation" Mar 08 03:11:47.497616 master-0 kubenswrapper[7648]: I0308 03:11:47.497581 7648 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 08 03:11:47.497912 master-0 kubenswrapper[7648]: I0308 03:11:47.497768 7648 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 03:01:34 +0000 UTC, rotation deadline is 2026-03-08 20:02:48.8380025 +0000 UTC Mar 08 03:11:47.497912 master-0 kubenswrapper[7648]: I0308 03:11:47.497907 7648 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 16h51m1.34009866s for next certificate rotation Mar 08 03:11:47.498971 master-0 kubenswrapper[7648]: I0308 03:11:47.498921 7648 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 03:11:47.502421 master-0 kubenswrapper[7648]: I0308 03:11:47.502317 7648 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 03:11:47.508046 master-0 kubenswrapper[7648]: I0308 03:11:47.508015 7648 log.go:25] "Validated CRI v1 runtime API" Mar 08 03:11:47.511928 master-0 kubenswrapper[7648]: I0308 03:11:47.511888 7648 log.go:25] "Validated CRI v1 image API" Mar 08 03:11:47.513745 master-0 kubenswrapper[7648]: I0308 03:11:47.513711 7648 server.go:1437] "Using cgroup driver 
setting received from the CRI runtime" cgroupDriver="systemd" Mar 08 03:11:47.519351 master-0 kubenswrapper[7648]: I0308 03:11:47.519305 7648 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 87790f63-c01f-464b-b8aa-2380aaf22629:/dev/vda3 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 08 03:11:47.519729 master-0 kubenswrapper[7648]: I0308 03:11:47.519344 7648 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b/userdata/shm major:0 minor:237 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab/userdata/shm major:0 minor:274 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/47b9c8e39f771f4a9c9b48442e3a3c8bc53bf486bacb3ac02dc486b0fde5415d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/47b9c8e39f771f4a9c9b48442e3a3c8bc53bf486bacb3ac02dc486b0fde5415d/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6c725404c21fc1f8e1d386945b71a5debdae8332b549c2d533bc3d6a6b387f25/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6c725404c21fc1f8e1d386945b71a5debdae8332b549c2d533bc3d6a6b387f25/userdata/shm major:0 minor:239 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba/userdata/shm major:0 minor:244 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939/userdata/shm major:0 minor:146 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/92d33cd4d391db44fa59251ab4f865e88339c3b4327ec053c536f051a308ce2b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/92d33cd4d391db44fa59251ab4f865e88339c3b4327ec053c536f051a308ce2b/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f/userdata/shm major:0 minor:247 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a9677e44cf88488e86493a105f95a756fe5dcdb4e68b6740b2fed8252e50fe4c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a9677e44cf88488e86493a105f95a756fe5dcdb4e68b6740b2fed8252e50fe4c/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223/userdata/shm major:0 minor:99 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ee50d0908ef8218db69129064969d66d86843ed87cd667dcc60ef7e0d8a70f21/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee50d0908ef8218db69129064969d66d86843ed87cd667dcc60ef7e0d8a70f21/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f94d3d5f4c829c12aa9cdd79c3a8b919521e9b4705852dc7634f236661eedb2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f94d3d5f4c829c12aa9cdd79c3a8b919521e9b4705852dc7634f236661eedb2a/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575/userdata/shm major:0 minor:276 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~projected/kube-api-access-4nrpc:{mountpoint:/var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~projected/kube-api-access-4nrpc major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/275be8d3-df30-46f7-9d0a-806e404dfd57/volumes/kubernetes.io~projected/kube-api-access-fb4bq:{mountpoint:/var/lib/kubelet/pods/275be8d3-df30-46f7-9d0a-806e404dfd57/volumes/kubernetes.io~projected/kube-api-access-fb4bq major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~projected/kube-api-access-5vt6t:{mountpoint:/var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~projected/kube-api-access-5vt6t major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~projected/kube-api-access major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~secret/serving-cert major:0 
minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~projected/kube-api-access-trhxt:{mountpoint:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~projected/kube-api-access-trhxt major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~projected/kube-api-access-qlj9x:{mountpoint:/var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~projected/kube-api-access-qlj9x major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~projected/kube-api-access-rk5ll:{mountpoint:/var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~projected/kube-api-access-rk5ll major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~projected/kube-api-access-m2vmz:{mountpoint:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~projected/kube-api-access-m2vmz major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~projected/kube-api-access-7jk4m:{mountpoint:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~projected/kube-api-access-7jk4m major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~projected/kube-api-access-wf24l:{mountpoint:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~projected/kube-api-access-wf24l major:0 minor:269 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~secret/serving-cert major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76ceb013-e999-4f15-bf25-f8dcd2647f9f/volumes/kubernetes.io~projected/kube-api-access-s8qqr:{mountpoint:/var/lib/kubelet/pods/76ceb013-e999-4f15-bf25-f8dcd2647f9f/volumes/kubernetes.io~projected/kube-api-access-s8qqr major:0 minor:115 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~projected/kube-api-access-m7hzl:{mountpoint:/var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~projected/kube-api-access-m7hzl major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~projected/kube-api-access-bkfwp:{mountpoint:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~projected/kube-api-access-bkfwp major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~projected/kube-api-access-4lb9w:{mountpoint:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~projected/kube-api-access-4lb9w major:0 minor:229 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~projected/kube-api-access-s8dn9:{mountpoint:/var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~projected/kube-api-access-s8dn9 major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~projected/kube-api-access-r8pfx:{mountpoint:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~projected/kube-api-access-r8pfx major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1/volumes/kubernetes.io~projected/kube-api-access-d5c5z:{mountpoint:/var/lib/kubelet/pods/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1/volumes/kubernetes.io~projected/kube-api-access-d5c5z major:0 minor:103 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~projected/kube-api-access-pv8wt:{mountpoint:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~projected/kube-api-access-pv8wt major:0 minor:139 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~projected/kube-api-access major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~projected/kube-api-access major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba9496ed-060e-4118-9da6-89b82bd49263/volumes/kubernetes.io~projected/kube-api-access-6mv56:{mountpoint:/var/lib/kubelet/pods/ba9496ed-060e-4118-9da6-89b82bd49263/volumes/kubernetes.io~projected/kube-api-access-6mv56 major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~projected/kube-api-access-llf9g:{mountpoint:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~projected/kube-api-access-llf9g major:0 minor:270 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~projected/kube-api-access-rf57p:{mountpoint:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~projected/kube-api-access-rf57p major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c4af87e2-50c3-4d08-9326-9c8876a6fd7b/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/c4af87e2-50c3-4d08-9326-9c8876a6fd7b/volumes/kubernetes.io~projected/kube-api-access major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~projected/kube-api-access-q9hb9:{mountpoint:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~projected/kube-api-access-q9hb9 major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~projected/kube-api-access-422p2:{mountpoint:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~projected/kube-api-access-422p2 major:0 minor:125 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~projected/kube-api-access-hbqkj:{mountpoint:/var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~projected/kube-api-access-hbqkj major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~projected/kube-api-access-zrdxk:{mountpoint:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~projected/kube-api-access-zrdxk major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~secret/serving-cert major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/kube-api-access-2xfxd:{mountpoint:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/kube-api-access-2xfxd major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~projected/kube-api-access-lghdk:{mountpoint:/var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~projected/kube-api-access-lghdk major:0 minor:258 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~projected/kube-api-access-hxqnd:{mountpoint:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~projected/kube-api-access-hxqnd major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/kube-api-access-8zh5b:{mountpoint:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/kube-api-access-8zh5b major:0 minor:271 fsType:tmpfs blockSize:0} overlay_0-101:{mountpoint:/var/lib/containers/storage/overlay/40f9c2974ba036bb89940573af144b4586eefd5a93302f118f7815601d3c098e/merged major:0 minor:101 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/0ec0fd29f070a982a3dfdc9df1c869c587f3154e6892e7b827688bd9a2bd32fe/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/62cbb1f9d29467e71689d3addcba69f073875fef18626a41132714c1a632b40a/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/69064194d38ddc5a84df392c6bccb6ce3c03f6a181ac1ee02f37449076b0fc8c/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/9eaa7f012506103b2e53795b45cf0bd9ca70f21270befca025803fd7739ce195/merged major:0 minor:121 fsType:overlay blockSize:0} 
overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/ab995380e3629db2698087e0aa28a4c5d6aba1208e4eb7c35ea1e0b10ec5f7a1/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/0db3042d052a37916d37142d030b052b85b4b48a2bb22499ae93770cb309dc12/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/f43709de24dee67b88fa62b9f9bc8ec4406cc244e1aa7b4ad1c196f255d7140b/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/e79b7218e6d5c34d1973fbfa72f1dc2fbd3e3374ea434a75590521f4935d2531/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/370cde7eafac098b6a9275153bc67d59fc782c0ca1d678379d115b2992ca7926/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/34f25d2541ccbc258036b532c478586d49916d7689e754bfb2036f77cdb6baaf/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/b4fba388b6f295d3fe277edfb579ced229f3f14e9563179e7c1d33f118008bf6/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/e872d2b18ddd96ff2f6342448cc9a15aef1b8ff55484f89addee75734f2e1c78/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/2233d9fe74718ab0ae53f5a512c7e547f9d981a9d9114a9700ec31c0933e09b7/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/57a9bfbbbe06e8741b7f71acad7d67f57133e3855842399a07580e12cdf86e5f/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-173:{mountpoint:/var/lib/containers/storage/overlay/937deb5ccbc384a4b4fcd24fdb4ab2a310e104b118773e5458288eb1635b4cc9/merged major:0 minor:173 fsType:overlay blockSize:0} 
overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/0493015aad0c37f1cf949671a0cd243c66517f1b7801835d988f52a0b4ebaf1f/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/d943d212aa3c805f0ee86ceb2aa1bac6052e1648d2966b3847e65ee807bc9427/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/b186b114d84ebb5d81155ab25a48687fed4904a89bc31cdad5486e28d5195171/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/256c05ea6649d1c6f82ba731be228111eae7455dc21c3a922778e8c4bb9728b6/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/51fc7ff6a4fa287c151dd6eb373813b8ab6fc82de69197375ecd1e21813c9166/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/ce27efa5236d15ebaf2f519800b99cfccf04bb89dedaec7f192fc22145dc0145/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-267:{mountpoint:/var/lib/containers/storage/overlay/d4c637eca45faf4dc5ae119273f6ca19e0b1e41ff9586ceaad7b96e492c21fb1/merged major:0 minor:267 fsType:overlay blockSize:0} overlay_0-278:{mountpoint:/var/lib/containers/storage/overlay/0177399de96c59f0d082a83f3c99d8728cd24f4354c84606115f9c7e387fd0bb/merged major:0 minor:278 fsType:overlay blockSize:0} overlay_0-280:{mountpoint:/var/lib/containers/storage/overlay/dd1e4d86d4bb5590e6222601731cda85d2ab1a42246bb905777d795449ce11f2/merged major:0 minor:280 fsType:overlay blockSize:0} overlay_0-282:{mountpoint:/var/lib/containers/storage/overlay/24e1c524b8242cfe6c6a73ec89989cbd20f616c0f238868827c96d17d61df07a/merged major:0 minor:282 fsType:overlay blockSize:0} overlay_0-284:{mountpoint:/var/lib/containers/storage/overlay/aa3e99c07231324870aab48ddf6ac3aa5f979f3c188a21bf2cb09ebc807ce3c6/merged major:0 minor:284 fsType:overlay blockSize:0} 
overlay_0-286:{mountpoint:/var/lib/containers/storage/overlay/4cc9f1601463d8cf563ff2927f37ef4c9df53ca94bb18a047a7185e9e2c4b8ec/merged major:0 minor:286 fsType:overlay blockSize:0} overlay_0-288:{mountpoint:/var/lib/containers/storage/overlay/a278a123654ccabfdecc16f667b23335fe34957bb6fa95ed755c6d6f2c25fc45/merged major:0 minor:288 fsType:overlay blockSize:0} overlay_0-290:{mountpoint:/var/lib/containers/storage/overlay/39925a702f39205a7d286aea1120e37a159eb7ea9547eb35c47107156a92beac/merged major:0 minor:290 fsType:overlay blockSize:0} overlay_0-292:{mountpoint:/var/lib/containers/storage/overlay/21bdcaa8c3c2b7bc82e63b75481269ac7050ce80c702009229ad24a08417b1de/merged major:0 minor:292 fsType:overlay blockSize:0} overlay_0-294:{mountpoint:/var/lib/containers/storage/overlay/3a80bde5f6d3bb761fc678109cf7e0321500b962b59e20991bea9b87c9343b7c/merged major:0 minor:294 fsType:overlay blockSize:0} overlay_0-296:{mountpoint:/var/lib/containers/storage/overlay/c36afb1a7d3d04e0a4e3111b86b4e4a854597f2e1301c7de09cc2fcce3f87ac6/merged major:0 minor:296 fsType:overlay blockSize:0} overlay_0-298:{mountpoint:/var/lib/containers/storage/overlay/e6230d35a24902da19a64d730c31efec5bf9088d8973d924e54531764da0630f/merged major:0 minor:298 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/e344abc9ec0b8ea68650286cf106e964370084c9106df9e89c37ad2bd15e3630/merged major:0 minor:300 fsType:overlay blockSize:0} overlay_0-302:{mountpoint:/var/lib/containers/storage/overlay/72b3f01c53e382b2bce7d4b7d0197edc22431f3ef65a4864c8918ecb3047d239/merged major:0 minor:302 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/35db7fa1165e5c9f94f7cac5976707e806554c5f00ac85fae6a7d277e54e0820/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/6334d65302c4b77c66483cd075c39ee4fe79e3caac4ea3ce4c41b95b06d1a1f5/merged major:0 minor:48 fsType:overlay blockSize:0} 
overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/ed0fa5573540caa0e03989916ac6b00a32ad2c42ab40751134860e1c42482ec8/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/c00f160b816e259221575d5a241ced88af0ea2cf9db59987705cf0d6ee2ff5ae/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/9d8b0598073bcc8d7331bacd1e5a2ffb09102eb7681db04d22c543f4ccfca5e8/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/bf04e15213a802a01f63b27a1293e16aab99152bfe41da3d1385e37faa966443/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/9fe893d7c61e44ea63eccffb887b389ae0f901d5e4789f316f225e753a6cd8bb/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/4646bdca662eef4647809339939b4cac973f68fa4cc60f42be0a678ee9853dc8/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/9e8436bc92a8a3296af5164d4c9d05a8c1c72d3e686484931070b5ac536d86e7/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/589d7a59825439a13f501df8dd29e9b1d1eecbd0b2884974edf6dcfe766b9c7f/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/5b6715530a2fde1e11e0d1403a69f4334d7c47efcaa06fda3442cd961e7c0b73/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/f36212582574ef3ce069bc93da0f99ab8002dc69ce7e5fa6c54379d6c9b0166d/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 08 03:11:47.545851 master-0 kubenswrapper[7648]: I0308 03:11:47.545090 7648 manager.go:217] Machine: {Timestamp:2026-03-08 03:11:47.544080644 +0000 UTC m=+0.155398974 CPUVendorID:AuthenticAMD NumCores:12 
NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:b3a4f7075cb34fef92c3bca0876fb6a9 SystemUUID:b3a4f707-5cb3-4fef-92c3-bca0876fb6a9 BootID:ab1d3f01-9ab7-4687-a25d-e07ad2358a90 Filesystems:[{Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~projected/kube-api-access-bkfwp DeviceMajor:0 DeviceMinor:222 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~projected/kube-api-access-rk5ll DeviceMajor:0 DeviceMinor:224 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:127 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/etcd-client 
DeviceMajor:0 DeviceMinor:216 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-267 DeviceMajor:0 DeviceMinor:267 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-284 DeviceMajor:0 DeviceMinor:284 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/76ceb013-e999-4f15-bf25-f8dcd2647f9f/volumes/kubernetes.io~projected/kube-api-access-s8qqr DeviceMajor:0 DeviceMinor:115 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~projected/kube-api-access-s8dn9 DeviceMajor:0 DeviceMinor:225 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~projected/kube-api-access-4lb9w DeviceMajor:0 DeviceMinor:229 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/ba9496ed-060e-4118-9da6-89b82bd49263/volumes/kubernetes.io~projected/kube-api-access-6mv56 DeviceMajor:0 DeviceMinor:246 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~projected/kube-api-access-wf24l DeviceMajor:0 DeviceMinor:269 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~projected/kube-api-access-m7hzl DeviceMajor:0 DeviceMinor:123 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79/userdata/shm DeviceMajor:0 DeviceMinor:104 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~projected/kube-api-access-q9hb9 DeviceMajor:0 DeviceMinor:126 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/kube-api-access-2xfxd DeviceMajor:0 DeviceMinor:233 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-290 DeviceMajor:0 DeviceMinor:290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~projected/kube-api-access-zrdxk DeviceMajor:0 DeviceMinor:249 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:260 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b/userdata/shm DeviceMajor:0 DeviceMinor:237 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f94d3d5f4c829c12aa9cdd79c3a8b919521e9b4705852dc7634f236661eedb2a/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-286 DeviceMajor:0 DeviceMinor:286 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-294 DeviceMajor:0 DeviceMinor:294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~projected/kube-api-access-r8pfx DeviceMajor:0 DeviceMinor:98 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a9677e44cf88488e86493a105f95a756fe5dcdb4e68b6740b2fed8252e50fe4c/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~projected/kube-api-access-m2vmz DeviceMajor:0 DeviceMinor:231 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:234 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~projected/kube-api-access-7jk4m DeviceMajor:0 DeviceMinor:241 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575/userdata/shm DeviceMajor:0 DeviceMinor:276 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-280 DeviceMajor:0 DeviceMinor:280 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-278 DeviceMajor:0 DeviceMinor:278 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~projected/kube-api-access-trhxt DeviceMajor:0 DeviceMinor:223 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/92d33cd4d391db44fa59251ab4f865e88339c3b4327ec053c536f051a308ce2b/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~projected/kube-api-access-lghdk DeviceMajor:0 DeviceMinor:258 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~projected/kube-api-access-4nrpc DeviceMajor:0 DeviceMinor:250 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:230 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:228 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:243 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/275be8d3-df30-46f7-9d0a-806e404dfd57/volumes/kubernetes.io~projected/kube-api-access-fb4bq DeviceMajor:0 DeviceMinor:263 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:236 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f/userdata/shm DeviceMajor:0 DeviceMinor:247 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ee50d0908ef8218db69129064969d66d86843ed87cd667dcc60ef7e0d8a70f21/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223/userdata/shm DeviceMajor:0 DeviceMinor:99 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~projected/kube-api-access-qlj9x DeviceMajor:0 DeviceMinor:255 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/c4af87e2-50c3-4d08-9326-9c8876a6fd7b/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:94 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-292 DeviceMajor:0 DeviceMinor:292 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~projected/kube-api-access-pv8wt DeviceMajor:0 DeviceMinor:139 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~projected/kube-api-access-hbqkj DeviceMajor:0 DeviceMinor:235 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba/userdata/shm DeviceMajor:0 DeviceMinor:244 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-296 DeviceMajor:0 DeviceMinor:296 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/6c725404c21fc1f8e1d386945b71a5debdae8332b549c2d533bc3d6a6b387f25/userdata/shm DeviceMajor:0 DeviceMinor:239 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939/userdata/shm DeviceMajor:0 DeviceMinor:146 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~projected/kube-api-access-rf57p DeviceMajor:0 DeviceMinor:226 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-288 DeviceMajor:0 DeviceMinor:288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-173 DeviceMajor:0 DeviceMinor:173 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:242 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/kube-api-access-8zh5b DeviceMajor:0 DeviceMinor:271 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~projected/kube-api-access-5vt6t DeviceMajor:0 DeviceMinor:232 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~projected/kube-api-access-llf9g DeviceMajor:0 DeviceMinor:270 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-298 DeviceMajor:0 DeviceMinor:298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-101 DeviceMajor:0 DeviceMinor:101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1/volumes/kubernetes.io~projected/kube-api-access-d5c5z DeviceMajor:0 DeviceMinor:103 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/47b9c8e39f771f4a9c9b48442e3a3c8bc53bf486bacb3ac02dc486b0fde5415d/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab/userdata/shm DeviceMajor:0 DeviceMinor:274 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-302 DeviceMajor:0 DeviceMinor:302 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~projected/kube-api-access-422p2 DeviceMajor:0 DeviceMinor:125 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~projected/kube-api-access-hxqnd DeviceMajor:0 DeviceMinor:227 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-282 DeviceMajor:0 DeviceMinor:282 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:11e59ac64a0deba MacAddress:82:bf:78:a7:94:02 Speed:10000 Mtu:8900} {Name:23bb11ba3a23ff0 MacAddress:6a:d8:b1:44:5d:e1 Speed:10000 Mtu:8900} {Name:47b9c8e39f771f4 MacAddress:8a:45:9a:08:24:e8 Speed:10000 Mtu:8900} {Name:6c725404c21fc1f MacAddress:7e:e9:05:3d:81:17 Speed:10000 Mtu:8900} {Name:755ba253608e09c MacAddress:56:48:6c:c6:85:af Speed:10000 Mtu:8900} {Name:92d33cd4d391db4 MacAddress:ce:d2:c7:63:cf:fe Speed:10000 Mtu:8900} {Name:9b8daf0d86e5f84 MacAddress:22:19:ce:7e:45:75 Speed:10000 Mtu:8900} {Name:a9677e44cf88488 MacAddress:5e:8c:9d:84:8f:c8 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:76:1e:7d:2a:6e:ef Speed:0 Mtu:8900} {Name:ca93e52234c8836 MacAddress:2e:21:79:13:42:36 Speed:10000 Mtu:8900} {Name:e51dfab7b748272 MacAddress:9e:05:42:95:de:68 Speed:10000 Mtu:8900} 
{Name:ee50d0908ef8218 MacAddress:de:9a:b9:4b:8f:d3 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:3a:b1:eb Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:76:56:79 Speed:-1 Mtu:9000} {Name:f94d3d5f4c829c1 MacAddress:42:a4:49:ee:e9:73 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:06:cd:ac:c4:de:2f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] 
Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 03:11:47.545851 master-0 kubenswrapper[7648]: I0308 03:11:47.545833 7648 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 08 03:11:47.546197 master-0 kubenswrapper[7648]: I0308 03:11:47.546002 7648 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 03:11:47.546320 master-0 kubenswrapper[7648]: I0308 03:11:47.546293 7648 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 03:11:47.546591 master-0 kubenswrapper[7648]: I0308 03:11:47.546537 7648 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 03:11:47.546837 master-0 kubenswrapper[7648]: I0308 03:11:47.546573 7648 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],
"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 03:11:47.546886 master-0 kubenswrapper[7648]: I0308 03:11:47.546862 7648 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 03:11:47.546886 master-0 kubenswrapper[7648]: I0308 03:11:47.546875 7648 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 03:11:47.546946 master-0 kubenswrapper[7648]: I0308 03:11:47.546886 7648 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:11:47.546946 master-0 kubenswrapper[7648]: I0308 03:11:47.546914 7648 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:11:47.547036 master-0 kubenswrapper[7648]: I0308 03:11:47.547019 7648 state_mem.go:36] "Initialized new in-memory state store" Mar 08 03:11:47.547165 master-0 kubenswrapper[7648]: I0308 03:11:47.547148 7648 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 03:11:47.547243 master-0 kubenswrapper[7648]: I0308 03:11:47.547227 7648 kubelet.go:418] "Attempting to sync node with API server" Mar 08 03:11:47.547274 master-0 kubenswrapper[7648]: I0308 03:11:47.547247 7648 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 03:11:47.547274 master-0 kubenswrapper[7648]: I0308 03:11:47.547264 7648 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 03:11:47.547388 master-0 kubenswrapper[7648]: I0308 03:11:47.547277 7648 kubelet.go:324] "Adding apiserver pod source" Mar 08 03:11:47.547388 master-0
kubenswrapper[7648]: I0308 03:11:47.547292 7648 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 03:11:47.548249 master-0 kubenswrapper[7648]: I0308 03:11:47.548220 7648 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 08 03:11:47.548397 master-0 kubenswrapper[7648]: I0308 03:11:47.548372 7648 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 08 03:11:47.548723 master-0 kubenswrapper[7648]: I0308 03:11:47.548696 7648 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 08 03:11:47.548867 master-0 kubenswrapper[7648]: I0308 03:11:47.548844 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 08 03:11:47.548897 master-0 kubenswrapper[7648]: I0308 03:11:47.548872 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 08 03:11:47.548897 master-0 kubenswrapper[7648]: I0308 03:11:47.548883 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 08 03:11:47.548897 master-0 kubenswrapper[7648]: I0308 03:11:47.548893 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 08 03:11:47.548978 master-0 kubenswrapper[7648]: I0308 03:11:47.548904 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 08 03:11:47.548978 master-0 kubenswrapper[7648]: I0308 03:11:47.548914 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 08 03:11:47.548978 master-0 kubenswrapper[7648]: I0308 03:11:47.548923 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 08 03:11:47.548978 master-0 kubenswrapper[7648]: I0308 03:11:47.548932 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 08 03:11:47.548978 master-0 
kubenswrapper[7648]: I0308 03:11:47.548942 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 08 03:11:47.548978 master-0 kubenswrapper[7648]: I0308 03:11:47.548953 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 08 03:11:47.549217 master-0 kubenswrapper[7648]: I0308 03:11:47.548983 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 08 03:11:47.549217 master-0 kubenswrapper[7648]: I0308 03:11:47.548999 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 08 03:11:47.549404 master-0 kubenswrapper[7648]: I0308 03:11:47.549379 7648 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 08 03:11:47.549813 master-0 kubenswrapper[7648]: I0308 03:11:47.549785 7648 server.go:1280] "Started kubelet" Mar 08 03:11:47.550033 master-0 kubenswrapper[7648]: I0308 03:11:47.549985 7648 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 03:11:47.550363 master-0 kubenswrapper[7648]: I0308 03:11:47.550327 7648 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 03:11:47.550398 master-0 kubenswrapper[7648]: I0308 03:11:47.550383 7648 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 08 03:11:47.550669 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 08 03:11:47.562679 master-0 kubenswrapper[7648]: I0308 03:11:47.562642 7648 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 08 03:11:47.567210 master-0 kubenswrapper[7648]: I0308 03:11:47.567177 7648 server.go:449] "Adding debug handlers to kubelet server"
Mar 08 03:11:47.569052 master-0 kubenswrapper[7648]: I0308 03:11:47.569025 7648 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 08 03:11:47.569097 master-0 kubenswrapper[7648]: I0308 03:11:47.569062 7648 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 08 03:11:47.569181 master-0 kubenswrapper[7648]: I0308 03:11:47.569137 7648 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:01:34 +0000 UTC, rotation deadline is 2026-03-09 00:30:32.455971159 +0000 UTC
Mar 08 03:11:47.569552 master-0 kubenswrapper[7648]: I0308 03:11:47.569525 7648 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 21h18m44.886449917s for next certificate rotation
Mar 08 03:11:47.569552 master-0 kubenswrapper[7648]: I0308 03:11:47.569207 7648 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 08 03:11:47.569629 master-0 kubenswrapper[7648]: I0308 03:11:47.569558 7648 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 08 03:11:47.569629 master-0 kubenswrapper[7648]: E0308 03:11:47.569164 7648 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 03:11:47.569629 master-0 kubenswrapper[7648]: I0308 03:11:47.569229 7648 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 08 03:11:47.570443 master-0 kubenswrapper[7648]: I0308 03:11:47.570422 7648 factory.go:55] Registering systemd factory
Mar 08 03:11:47.570443 master-0 kubenswrapper[7648]: I0308 03:11:47.570442 7648 factory.go:221] Registration of the systemd container factory successfully
Mar 08 03:11:47.570733 master-0 kubenswrapper[7648]: I0308 03:11:47.570699 7648 factory.go:153] Registering CRI-O factory
Mar 08 03:11:47.570733 master-0 kubenswrapper[7648]: I0308 03:11:47.570726 7648 factory.go:221] Registration of the crio container factory successfully
Mar 08 03:11:47.570818 master-0 kubenswrapper[7648]: I0308 03:11:47.570798 7648 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 08 03:11:47.570868 master-0 kubenswrapper[7648]: I0308 03:11:47.570828 7648 factory.go:103] Registering Raw factory
Mar 08 03:11:47.570868 master-0 kubenswrapper[7648]: I0308 03:11:47.570844 7648 manager.go:1196] Started watching for new ooms in manager
Mar 08 03:11:47.571323 master-0 kubenswrapper[7648]: I0308 03:11:47.571299 7648 manager.go:319] Starting recovery of all containers
Mar 08 03:11:47.572421 master-0 kubenswrapper[7648]: I0308 03:11:47.572382 7648 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 03:11:47.574087 master-0 kubenswrapper[7648]: I0308 03:11:47.574011 7648 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 08 03:11:47.574373 master-0 kubenswrapper[7648]: I0308 03:11:47.574249 7648 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 08 03:11:47.578988 master-0 kubenswrapper[7648]: I0308 03:11:47.578907 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert" seLinuxMountContext=""
Mar 08 03:11:47.579087 master-0 kubenswrapper[7648]: I0308 03:11:47.578985 7648 reconstruct.go:130] "Volume is marked as uncertain and added
into the actual state" pod="" podName="c0a08ddb-1045-4631-ba52-93f3046ebd0a" volumeName="kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p" seLinuxMountContext="" Mar 08 03:11:47.579087 master-0 kubenswrapper[7648]: I0308 03:11:47.579008 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" volumeName="kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access" seLinuxMountContext="" Mar 08 03:11:47.579087 master-0 kubenswrapper[7648]: I0308 03:11:47.579028 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d23557f-6bb1-46ce-a56e-d0011c576125" volumeName="kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp" seLinuxMountContext="" Mar 08 03:11:47.579087 master-0 kubenswrapper[7648]: I0308 03:11:47.579047 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="982ea338-c7be-4776-9bb7-113834c54aaa" volumeName="kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls" seLinuxMountContext="" Mar 08 03:11:47.579087 master-0 kubenswrapper[7648]: I0308 03:11:47.579065 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config" seLinuxMountContext="" Mar 08 03:11:47.579087 master-0 kubenswrapper[7648]: I0308 03:11:47.579080 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e71caa06-6ce7-47c9-a267-21f6b6af9247" volumeName="kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579094 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="eedc7538-9cc6-4bf5-9628-e278310d796b" volumeName="kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579140 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70fba73e-c201-4866-bc69-64892ea5bdca" volumeName="kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579154 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579169 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4108f513-acef-473a-ab03-f3761b2bd0d8" volumeName="kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579184 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579202 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d23557f-6bb1-46ce-a56e-d0011c576125" volumeName="kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579223 7648 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="982ea338-c7be-4776-9bb7-113834c54aaa" volumeName="kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579247 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3178dfc0-a35e-418e-a954-cd919b8af88c" volumeName="kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579264 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf93333-b537-4f23-9c77-6a245b290fe3" volumeName="kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579279 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9949f9f4-00f3-4ac8-b8a2-a9549693f5b1" volumeName="kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579300 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579313 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d83aa242-606f-4adc-b689-4aa89625b533" volumeName="kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj" seLinuxMountContext="" Mar 08 03:11:47.579333 master-0 kubenswrapper[7648]: I0308 03:11:47.579332 7648 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="e74c8bb2-e063-4b60-b3fe-651aa534d029" volumeName="kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579349 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579364 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fd6b827c-70b0-47ed-b07c-c696343248a8" volumeName="kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579384 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf93333-b537-4f23-9c77-6a245b290fe3" volumeName="kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579402 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6432d23b-a55a-4131-83d5-5f16419809dd" volumeName="kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579420 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3178dfc0-a35e-418e-a954-cd919b8af88c" volumeName="kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579433 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579455 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f99f81a-fd2d-432e-a3bc-e451342650b1" volumeName="kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579473 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides" seLinuxMountContext="" Mar 08 03:11:47.579895 master-0 kubenswrapper[7648]: I0308 03:11:47.579512 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy" seLinuxMountContext="" Mar 08 03:11:47.580197 master-0 kubenswrapper[7648]: I0308 03:11:47.579978 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" volumeName="kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl" seLinuxMountContext="" Mar 08 03:11:47.580636 master-0 kubenswrapper[7648]: I0308 03:11:47.580032 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9949f9f4-00f3-4ac8-b8a2-a9549693f5b1" volumeName="kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy" seLinuxMountContext="" Mar 08 03:11:47.580678 master-0 kubenswrapper[7648]: I0308 03:11:47.580660 7648 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="4f822854-b9ac-46f2-b03b-e7215fba9208" volumeName="kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll" seLinuxMountContext="" Mar 08 03:11:47.580730 master-0 kubenswrapper[7648]: I0308 03:11:47.580686 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config" seLinuxMountContext="" Mar 08 03:11:47.580730 master-0 kubenswrapper[7648]: I0308 03:11:47.580702 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm" seLinuxMountContext="" Mar 08 03:11:47.580793 master-0 kubenswrapper[7648]: I0308 03:11:47.580738 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" volumeName="kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca" seLinuxMountContext="" Mar 08 03:11:47.580793 master-0 kubenswrapper[7648]: I0308 03:11:47.580753 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib" seLinuxMountContext="" Mar 08 03:11:47.580793 master-0 kubenswrapper[7648]: I0308 03:11:47.580764 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e71caa06-6ce7-47c9-a267-21f6b6af9247" volumeName="kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk" seLinuxMountContext="" Mar 08 03:11:47.580793 master-0 kubenswrapper[7648]: I0308 03:11:47.580781 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="fd6b827c-70b0-47ed-b07c-c696343248a8" volumeName="kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca" seLinuxMountContext="" Mar 08 03:11:47.580896 master-0 kubenswrapper[7648]: I0308 03:11:47.580808 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fd6b827c-70b0-47ed-b07c-c696343248a8" volumeName="kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b" seLinuxMountContext="" Mar 08 03:11:47.580896 master-0 kubenswrapper[7648]: I0308 03:11:47.580827 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="275be8d3-df30-46f7-9d0a-806e404dfd57" volumeName="kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq" seLinuxMountContext="" Mar 08 03:11:47.580896 master-0 kubenswrapper[7648]: I0308 03:11:47.580837 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3178dfc0-a35e-418e-a954-cd919b8af88c" volumeName="kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config" seLinuxMountContext="" Mar 08 03:11:47.580896 master-0 kubenswrapper[7648]: I0308 03:11:47.580849 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config" seLinuxMountContext="" Mar 08 03:11:47.580896 master-0 kubenswrapper[7648]: I0308 03:11:47.580865 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" volumeName="kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g" seLinuxMountContext="" Mar 08 03:11:47.580896 master-0 kubenswrapper[7648]: I0308 03:11:47.580895 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="70fba73e-c201-4866-bc69-64892ea5bdca" volumeName="kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581046 master-0 kubenswrapper[7648]: I0308 03:11:47.580908 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr" seLinuxMountContext="" Mar 08 03:11:47.581046 master-0 kubenswrapper[7648]: I0308 03:11:47.580926 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aadbbe97-2a03-40da-846d-252e29661f67" volumeName="kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access" seLinuxMountContext="" Mar 08 03:11:47.581046 master-0 kubenswrapper[7648]: I0308 03:11:47.580940 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b83ab56c-e28d-4e82-ae8f-92649a1448ed" volumeName="kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access" seLinuxMountContext="" Mar 08 03:11:47.581046 master-0 kubenswrapper[7648]: I0308 03:11:47.580975 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba9496ed-060e-4118-9da6-89b82bd49263" volumeName="kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56" seLinuxMountContext="" Mar 08 03:11:47.581166 master-0 kubenswrapper[7648]: I0308 03:11:47.581047 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a08ddb-1045-4631-ba52-93f3046ebd0a" volumeName="kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config" seLinuxMountContext="" Mar 08 03:11:47.581166 master-0 kubenswrapper[7648]: I0308 03:11:47.581065 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e71caa06-6ce7-47c9-a267-21f6b6af9247" volumeName="kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581166 master-0 kubenswrapper[7648]: I0308 03:11:47.581081 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="275be8d3-df30-46f7-9d0a-806e404dfd57" volumeName="kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script" seLinuxMountContext="" Mar 08 03:11:47.581166 master-0 kubenswrapper[7648]: I0308 03:11:47.581092 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9949f9f4-00f3-4ac8-b8a2-a9549693f5b1" volumeName="kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z" seLinuxMountContext="" Mar 08 03:11:47.581166 master-0 kubenswrapper[7648]: I0308 03:11:47.581131 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt" seLinuxMountContext="" Mar 08 03:11:47.581166 master-0 kubenswrapper[7648]: I0308 03:11:47.581148 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aadbbe97-2a03-40da-846d-252e29661f67" volumeName="kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config" seLinuxMountContext="" Mar 08 03:11:47.581166 master-0 kubenswrapper[7648]: I0308 03:11:47.581161 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a08ddb-1045-4631-ba52-93f3046ebd0a" volumeName="kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581180 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581212 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf93333-b537-4f23-9c77-6a245b290fe3" volumeName="kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581229 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581239 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" volumeName="kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581252 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581263 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eedc7538-9cc6-4bf5-9628-e278310d796b" volumeName="kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581292 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="70fba73e-c201-4866-bc69-64892ea5bdca" volumeName="kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581306 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aadbbe97-2a03-40da-846d-252e29661f67" volumeName="kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581319 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b83ab56c-e28d-4e82-ae8f-92649a1448ed" volumeName="kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581339 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581371 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e74c8bb2-e063-4b60-b3fe-651aa534d029" volumeName="kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581382 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config" seLinuxMountContext="" Mar 08 03:11:47.581397 master-0 kubenswrapper[7648]: I0308 03:11:47.581396 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="23b66415-df37-4015-9a0c-69115b3a0739" volumeName="kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581413 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581443 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b83ab56c-e28d-4e82-ae8f-92649a1448ed" volumeName="kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581457 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581470 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581505 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581516 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581530 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581545 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6432d23b-a55a-4131-83d5-5f16419809dd" volumeName="kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581555 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581586 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e74c8bb2-e063-4b60-b3fe-651aa534d029" volumeName="kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581597 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581619 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="2bbe9b81-0efb-4caa-bacd-55348cd392c6" volumeName="kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581634 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4108f513-acef-473a-ab03-f3761b2bd0d8" volumeName="kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581670 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d23557f-6bb1-46ce-a56e-d0011c576125" volumeName="kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581685 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581694 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581710 7648 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581720 7648 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6432d23b-a55a-4131-83d5-5f16419809dd" volumeName="kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config" seLinuxMountContext="" Mar 08 03:11:47.581728 master-0 kubenswrapper[7648]: I0308 03:11:47.581747 7648 reconstruct.go:97] "Volume reconstruction finished" Mar 08 03:11:47.582257 master-0 kubenswrapper[7648]: I0308 03:11:47.581756 7648 reconciler.go:26] "Reconciler: start to sync state" Mar 08 03:11:47.585929 master-0 kubenswrapper[7648]: I0308 03:11:47.585909 7648 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 08 03:11:47.609689 master-0 kubenswrapper[7648]: I0308 03:11:47.609568 7648 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 03:11:47.611758 master-0 kubenswrapper[7648]: I0308 03:11:47.611733 7648 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 08 03:11:47.611804 master-0 kubenswrapper[7648]: I0308 03:11:47.611771 7648 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 03:11:47.611804 master-0 kubenswrapper[7648]: I0308 03:11:47.611793 7648 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 03:11:47.611939 master-0 kubenswrapper[7648]: E0308 03:11:47.611915 7648 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 08 03:11:47.613266 master-0 kubenswrapper[7648]: I0308 03:11:47.613231 7648 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 03:11:47.629794 master-0 kubenswrapper[7648]: I0308 03:11:47.629717 7648 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="c1d3c31c196416ae00334f18b3e579542658be979ab39e41ffb430f787c5ee3e" exitCode=0 Mar 08 03:11:47.629794 master-0 kubenswrapper[7648]: I0308 03:11:47.629784 7648 generic.go:334] "Generic (PLEG): container 
finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="b3ea93aa98c6a855a072d3642fcdd00f5f7951231e2c2010a477ac7e3afcf009" exitCode=0 Mar 08 03:11:47.629794 master-0 kubenswrapper[7648]: I0308 03:11:47.629797 7648 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="bf04118919c009f59ea3e84f16c295d8440cef4db850135663e4a2db1d87ef48" exitCode=0 Mar 08 03:11:47.630049 master-0 kubenswrapper[7648]: I0308 03:11:47.629806 7648 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="db3a63a925785d6eff81f565afee5497f9a99d04d1c84187c3150ffb13b3defd" exitCode=0 Mar 08 03:11:47.630686 master-0 kubenswrapper[7648]: I0308 03:11:47.629883 7648 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="2adae51f407e15a120ce855a4a69d4bbd243779881704875d67dd256bba0227a" exitCode=0 Mar 08 03:11:47.630727 master-0 kubenswrapper[7648]: I0308 03:11:47.630697 7648 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="4aa5fc291dd0b6e7ec288140975372ce39389e86edf66a268784556c20872aa9" exitCode=0 Mar 08 03:11:47.632978 master-0 kubenswrapper[7648]: I0308 03:11:47.632959 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:11:47.633305 master-0 kubenswrapper[7648]: I0308 03:11:47.633278 7648 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a" exitCode=1 Mar 08 03:11:47.633305 master-0 kubenswrapper[7648]: I0308 03:11:47.633299 7648 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="a319c730809ca26c3a85e3da2618b48d9d17632d0a08b9cfde4f3e18505c5755" 
exitCode=0 Mar 08 03:11:47.639553 master-0 kubenswrapper[7648]: I0308 03:11:47.639285 7648 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42" exitCode=1 Mar 08 03:11:47.648818 master-0 kubenswrapper[7648]: I0308 03:11:47.648790 7648 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" containerID="d7c267c7d1ad40b10c4f9d19008802c751a1cdd3364f0744ee013a61bcad5ca6" exitCode=0 Mar 08 03:11:47.653391 master-0 kubenswrapper[7648]: I0308 03:11:47.653354 7648 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="fce5e0a13a8a0d48e21a3b4ab57b6f9f5f4d96f3d5aa8a45af37601d35ca1619" exitCode=0 Mar 08 03:11:47.655846 master-0 kubenswrapper[7648]: I0308 03:11:47.655805 7648 generic.go:334] "Generic (PLEG): container finished" podID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerID="00d06648e335af10bd876c293f9902417ade2722b0f152f68b636aa5a6ef0592" exitCode=0 Mar 08 03:11:47.674702 master-0 kubenswrapper[7648]: I0308 03:11:47.674661 7648 generic.go:334] "Generic (PLEG): container finished" podID="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" containerID="a24cd319d12c3bab1bf2b10e5afaf7c8e507dee6da981a24060116593e6e64aa" exitCode=0 Mar 08 03:11:47.676600 master-0 kubenswrapper[7648]: I0308 03:11:47.676535 7648 generic.go:334] "Generic (PLEG): container finished" podID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerID="6f47524787fe6d12f2f00918cc138535f7c801d780aa325200500bc9264d2c6c" exitCode=0 Mar 08 03:11:47.680056 master-0 kubenswrapper[7648]: I0308 03:11:47.680024 7648 generic.go:334] "Generic (PLEG): container finished" podID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerID="2ecfec74d59cc3d0b000968048ba4bb7931b60227ed12aaa53445141ec092ff9" exitCode=0 Mar 08 03:11:47.712579 master-0 kubenswrapper[7648]: E0308 03:11:47.712453 7648 kubelet.go:2359] "Skipping pod 
synchronization" err="container runtime status check may not have completed yet" Mar 08 03:11:47.723624 master-0 kubenswrapper[7648]: I0308 03:11:47.723581 7648 manager.go:324] Recovery completed Mar 08 03:11:47.766423 master-0 kubenswrapper[7648]: I0308 03:11:47.766277 7648 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 03:11:47.766423 master-0 kubenswrapper[7648]: I0308 03:11:47.766306 7648 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 03:11:47.766423 master-0 kubenswrapper[7648]: I0308 03:11:47.766350 7648 state_mem.go:36] "Initialized new in-memory state store" Mar 08 03:11:47.766752 master-0 kubenswrapper[7648]: I0308 03:11:47.766690 7648 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 08 03:11:47.766752 master-0 kubenswrapper[7648]: I0308 03:11:47.766710 7648 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 08 03:11:47.766867 master-0 kubenswrapper[7648]: I0308 03:11:47.766759 7648 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 08 03:11:47.766867 master-0 kubenswrapper[7648]: I0308 03:11:47.766772 7648 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 08 03:11:47.766867 master-0 kubenswrapper[7648]: I0308 03:11:47.766783 7648 policy_none.go:49] "None policy: Start" Mar 08 03:11:47.768260 master-0 kubenswrapper[7648]: I0308 03:11:47.768224 7648 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 03:11:47.768260 master-0 kubenswrapper[7648]: I0308 03:11:47.768253 7648 state_mem.go:35] "Initializing new in-memory state store" Mar 08 03:11:47.768558 master-0 kubenswrapper[7648]: I0308 03:11:47.768517 7648 state_mem.go:75] "Updated machine memory state" Mar 08 03:11:47.768643 master-0 kubenswrapper[7648]: I0308 03:11:47.768559 7648 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 08 03:11:47.782751 master-0 kubenswrapper[7648]: I0308 03:11:47.782714 7648 manager.go:334] "Starting Device Plugin 
manager" Mar 08 03:11:47.782888 master-0 kubenswrapper[7648]: I0308 03:11:47.782811 7648 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 03:11:47.782998 master-0 kubenswrapper[7648]: I0308 03:11:47.782950 7648 server.go:79] "Starting device plugin registration server" Mar 08 03:11:47.783631 master-0 kubenswrapper[7648]: I0308 03:11:47.783591 7648 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 03:11:47.784204 master-0 kubenswrapper[7648]: I0308 03:11:47.783758 7648 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 08 03:11:47.784335 master-0 kubenswrapper[7648]: I0308 03:11:47.784309 7648 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 08 03:11:47.784567 master-0 kubenswrapper[7648]: I0308 03:11:47.784534 7648 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 08 03:11:47.784647 master-0 kubenswrapper[7648]: I0308 03:11:47.784580 7648 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 08 03:11:47.887717 master-0 kubenswrapper[7648]: I0308 03:11:47.887615 7648 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:11:47.889353 master-0 kubenswrapper[7648]: I0308 03:11:47.889324 7648 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:11:47.889412 master-0 kubenswrapper[7648]: I0308 03:11:47.889360 7648 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:11:47.889412 master-0 kubenswrapper[7648]: I0308 03:11:47.889372 7648 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:11:47.889477 master-0 kubenswrapper[7648]: I0308 03:11:47.889444 7648 
kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:11:47.903095 master-0 kubenswrapper[7648]: I0308 03:11:47.903030 7648 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 08 03:11:47.903226 master-0 kubenswrapper[7648]: I0308 03:11:47.903196 7648 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 08 03:11:47.913653 master-0 kubenswrapper[7648]: I0308 03:11:47.913595 7648 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 03:11:47.914180 master-0 kubenswrapper[7648]: I0308 03:11:47.914078 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"6f51a4db85d18d82603b426a557c9c6da1c85541f85af4f912c744b7f3a66c18"} Mar 08 03:11:47.915870 master-0 kubenswrapper[7648]: I0308 03:11:47.915816 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a"} Mar 08 03:11:47.915870 master-0 kubenswrapper[7648]: I0308 03:11:47.915867 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"a319c730809ca26c3a85e3da2618b48d9d17632d0a08b9cfde4f3e18505c5755"} Mar 08 03:11:47.916016 master-0 kubenswrapper[7648]: I0308 03:11:47.915885 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8"} Mar 08 03:11:47.916016 master-0 kubenswrapper[7648]: I0308 03:11:47.915913 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"9e6b8d3a0f03e9035732338fcc893d4e73b26cab45767d3a7fcf55c614fe104a"} Mar 08 03:11:47.916016 master-0 kubenswrapper[7648]: I0308 03:11:47.915929 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"5cf51539374bcfe72a242f1e53596d9c98c86b64c9179b7354efb8ce2765e3ca"} Mar 08 03:11:47.916016 master-0 kubenswrapper[7648]: I0308 03:11:47.915943 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42"} Mar 08 03:11:47.916016 master-0 kubenswrapper[7648]: I0308 03:11:47.915960 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0"} Mar 08 03:11:47.916016 master-0 kubenswrapper[7648]: I0308 03:11:47.915977 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"0c2763066b9b93da23f0fac4ed741acd53596f68416bb8dfcb0cbbdd5cec3459"} Mar 08 03:11:47.916016 master-0 kubenswrapper[7648]: I0308 03:11:47.915993 7648 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916023 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"4408f61b9048ed833e9161e86cec42c8c15221795d207fe82e8f7a4527778dfb"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916043 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"66dcb2ef9f56c8175e9938f33a7650abc0b5ef0e638ee33a15fd5eee5cc90aba"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916059 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"fce5e0a13a8a0d48e21a3b4ab57b6f9f5f4d96f3d5aa8a45af37601d35ca1619"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916078 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916102 7648 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d82a00afa20186cdbfb5aa25ef4be05563376207d7df03a6e01ae58dbf81de" Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916123 7648 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f" Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916137 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916155 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916169 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74"} Mar 08 03:11:47.916277 master-0 kubenswrapper[7648]: I0308 03:11:47.916215 7648 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a" Mar 08 03:11:47.927886 master-0 kubenswrapper[7648]: E0308 03:11:47.927842 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:11:47.928466 master-0 kubenswrapper[7648]: W0308 03:11:47.928424 7648 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted 
capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 08 03:11:47.928466 master-0 kubenswrapper[7648]: E0308 03:11:47.928458 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:11:47.928662 master-0 kubenswrapper[7648]: E0308 03:11:47.928544 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:47.928662 master-0 kubenswrapper[7648]: E0308 03:11:47.928595 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:11:47.928662 master-0 kubenswrapper[7648]: E0308 03:11:47.928620 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:47.988088 master-0 kubenswrapper[7648]: I0308 03:11:47.988040 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:47.988088 master-0 kubenswrapper[7648]: I0308 03:11:47.988086 7648 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988106 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988125 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988152 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988165 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988181 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988195 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988213 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988227 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988240 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod 
\"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988255 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988268 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988282 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:47.988319 master-0 kubenswrapper[7648]: I0308 03:11:47.988298 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:11:47.989287 master-0 kubenswrapper[7648]: I0308 03:11:47.988313 7648 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:11:47.989287 master-0 kubenswrapper[7648]: I0308 03:11:47.988362 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:48.089319 master-0 kubenswrapper[7648]: I0308 03:11:48.089262 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089320 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089358 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089382 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089404 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089430 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089454 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089475 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089514 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089534 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089554 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.089575 master-0 kubenswrapper[7648]: I0308 03:11:48.089574 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089596 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089618 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089644 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089666 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089688 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089760 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089824 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089857 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089887 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089915 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089942 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.089969 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.090000 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.090027 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.090057 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.090086 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090121 master-0 kubenswrapper[7648]: I0308 03:11:48.090109 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.090999 master-0 kubenswrapper[7648]: I0308 03:11:48.090146 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:11:48.090999 master-0 kubenswrapper[7648]: I0308 03:11:48.090170 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:48.090999 master-0 kubenswrapper[7648]: I0308 03:11:48.090192 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:48.090999 master-0 kubenswrapper[7648]: I0308 03:11:48.090214 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:11:48.090999 master-0 kubenswrapper[7648]: I0308 03:11:48.090235 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:11:48.549772 master-0 kubenswrapper[7648]: I0308 03:11:48.549665 7648 apiserver.go:52] "Watching apiserver"
Mar 08 03:11:48.564703 master-0 kubenswrapper[7648]: I0308 03:11:48.564653 7648 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 03:11:48.566581 master-0 kubenswrapper[7648]: I0308 03:11:48.566534 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-r9m2k","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf","openshift-network-operator/network-operator-7c649bf6d4-98n6d","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc","openshift-multus/multus-additional-cni-plugins-5qjn5","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8","openshift-multus/multus-admission-controller-8d675b596-772zs","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x","openshift-multus/network-metrics-daemon-jl9tj","openshift-network-operator/iptables-alerter-g86jc","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr","openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7","openshift-dns-operator/dns-operator-589895fbb7-z45kw","openshift-etcd/etcd-master-0-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c","assisted-installer/assisted-installer-controller-9g2h9","openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775","openshift-ovn-kubernetes/ovnkube-node-krdvz","openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q","openshift-network-diagnostics/network-check-target-l5x6h","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6","openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq","openshift-multus/multus-hfnwm","openshift-network-node-identity/network-node-identity-xjg74","kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 08 03:11:48.570026 master-0 kubenswrapper[7648]: I0308 03:11:48.569992 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-9g2h9"
Mar 08 03:11:48.570232 master-0 kubenswrapper[7648]: I0308 03:11:48.570198 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:48.570940 master-0 kubenswrapper[7648]: I0308 03:11:48.570857 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:11:48.571920 master-0 kubenswrapper[7648]: I0308 03:11:48.571888 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:48.572773 master-0 kubenswrapper[7648]: I0308 03:11:48.572751 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:11:48.573592 master-0 kubenswrapper[7648]: I0308 03:11:48.573558 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:11:48.574167 master-0 kubenswrapper[7648]: I0308 03:11:48.574121 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:11:48.574643 master-0 kubenswrapper[7648]: I0308 03:11:48.574618 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:11:48.574954 master-0 kubenswrapper[7648]: I0308 03:11:48.574928 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:11:48.575061 master-0 kubenswrapper[7648]: I0308 03:11:48.575033 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:11:48.575316 master-0 kubenswrapper[7648]: I0308 03:11:48.575290 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:48.575387 master-0 kubenswrapper[7648]: I0308 03:11:48.575366 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:48.575498 master-0 kubenswrapper[7648]: I0308 03:11:48.575448 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:48.575610 master-0 kubenswrapper[7648]: I0308 03:11:48.575560 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:11:48.576339 master-0 kubenswrapper[7648]: I0308 03:11:48.576313 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:48.579387 master-0 kubenswrapper[7648]: I0308 03:11:48.579362 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 03:11:48.579495 master-0 kubenswrapper[7648]: I0308 03:11:48.579453 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 03:11:48.579559 master-0 kubenswrapper[7648]: I0308 03:11:48.579540 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 03:11:48.579639 master-0 kubenswrapper[7648]: I0308 03:11:48.579617 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.579696 master-0 kubenswrapper[7648]: I0308 03:11:48.579684 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.579727 master-0 kubenswrapper[7648]: I0308 03:11:48.579700 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.579767 master-0 kubenswrapper[7648]: I0308 03:11:48.579752 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.579855 master-0 kubenswrapper[7648]: I0308 03:11:48.579835 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.579891 master-0 kubenswrapper[7648]: I0308 03:11:48.579859 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 03:11:48.579891 master-0 kubenswrapper[7648]: I0308 03:11:48.579886 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.579975 master-0 kubenswrapper[7648]: I0308 03:11:48.579706 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.580690 master-0 kubenswrapper[7648]: I0308 03:11:48.580658 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 03:11:48.580821 master-0 kubenswrapper[7648]: I0308 03:11:48.580784 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.592152 master-0 kubenswrapper[7648]: I0308 03:11:48.591801 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 03:11:48.592152 master-0 kubenswrapper[7648]: I0308 03:11:48.591982 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.593554 master-0 kubenswrapper[7648]: I0308 03:11:48.593330 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 03:11:48.593554 master-0 kubenswrapper[7648]: I0308 03:11:48.593443 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 08 03:11:48.593554 master-0 kubenswrapper[7648]: I0308 03:11:48.593555 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 08 03:11:48.593716 master-0 kubenswrapper[7648]: I0308 03:11:48.593670 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 03:11:48.593828 master-0 kubenswrapper[7648]: I0308 03:11:48.593778 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 03:11:48.593828 master-0 kubenswrapper[7648]: I0308 03:11:48.593796 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 08 03:11:48.594015 master-0 kubenswrapper[7648]: I0308 03:11:48.593984 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 03:11:48.594114 master-0 kubenswrapper[7648]: I0308 03:11:48.594086 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 03:11:48.594192 master-0 kubenswrapper[7648]: I0308 03:11:48.594174 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 03:11:48.594285 master-0 kubenswrapper[7648]: I0308 03:11:48.594269 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 03:11:48.594400 master-0 kubenswrapper[7648]: I0308 03:11:48.594378 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 08 03:11:48.594550 master-0 kubenswrapper[7648]: I0308 03:11:48.594532 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 08 03:11:48.594731 master-0 kubenswrapper[7648]: I0308 03:11:48.594698 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 08 03:11:48.595010 master-0 kubenswrapper[7648]: I0308 03:11:48.594962 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:11:48.595147 master-0 kubenswrapper[7648]: I0308 03:11:48.595101 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trhxt\" (UniqueName: \"kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596283 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596338 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596338 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596457 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf24l\" (UniqueName: \"kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596516 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596558 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596580 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596678 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596731 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596763 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596733 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pfx\" (UniqueName: \"kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597027 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597059 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597109 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597132 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llf9g\" (UniqueName: \"kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597155 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbqkj\" (UniqueName: \"kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596877 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597174 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597196 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfxd\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597216 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597233 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596886 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597236 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596912 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597327 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597358 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596990 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597410 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597418 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.596991 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597044 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597093 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597539 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597565 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597572 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597514 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597111 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597581 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597626 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597653 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597714 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597744 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597745 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597833 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.597841 master-0 kubenswrapper[7648]: I0308 03:11:48.597901 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.597913 7648 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.597937 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.597969 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.597991 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598012 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598068 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598072 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598203 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598211 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598245 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598263 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 
kubenswrapper[7648]: I0308 03:11:48.598267 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598343 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598382 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598425 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598461 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 
03:11:48.598516 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598557 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598571 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598587 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598560 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 
03:11:48.598755 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598780 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598598 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598721 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598723 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598827 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598834 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jk4m\" (UniqueName: \"kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598870 7648 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598895 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598907 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598904 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599000 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599010 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599030 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599033 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599032 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599088 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599142 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.598889 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599049 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599231 7648 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599152 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599256 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599248 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599278 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599090 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599395 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599371 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mv56\" (UniqueName: 
\"kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56\") pod \"csi-snapshot-controller-operator-5685fbc7d-8fxl8\" (UID: \"ba9496ed-060e-4118-9da6-89b82bd49263\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599472 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2vmz\" (UniqueName: \"kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599516 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599534 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599563 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599751 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf57p\" (UniqueName: 
\"kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599779 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599800 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599818 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghdk\" (UniqueName: \"kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:48.599790 master-0 kubenswrapper[7648]: I0308 03:11:48.599838 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 
03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.599994 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600020 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600035 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600046 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600069 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdxk\" (UniqueName: 
\"kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600104 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600126 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600186 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600231 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600327 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600328 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600370 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600406 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600439 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600460 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600464 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqnd\" (UniqueName: \"kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600526 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600552 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600655 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600681 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600702 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600724 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600737 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: 
\"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600747 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600773 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600795 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600818 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: 
I0308 03:11:48.600819 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600837 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5ll\" (UniqueName: \"kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600858 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600878 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlj9x\" (UniqueName: \"kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600901 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls\") 
pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600907 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600938 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600966 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.600991 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dn9\" (UniqueName: \"kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 
03:11:48.601017 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601040 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601062 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601073 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601082 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lb9w\" (UniqueName: 
\"kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601118 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601127 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601148 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601152 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601258 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: 
I0308 03:11:48.601259 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601378 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601393 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601389 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601424 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qqr\" (UniqueName: \"kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601477 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601496 7648 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601468 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601583 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601612 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601703 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601734 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601770 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601798 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601815 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601833 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601611 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601910 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601927 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 
03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601951 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkfwp\" (UniqueName: \"kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601984 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.601989 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.602092 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.602104 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" 
Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.602129 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zh5b\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.602155 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vt6t\" (UniqueName: \"kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:48.602259 master-0 kubenswrapper[7648]: I0308 03:11:48.602321 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.607999 master-0 kubenswrapper[7648]: I0308 03:11:48.607906 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:48.609293 master-0 kubenswrapper[7648]: I0308 03:11:48.609247 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:48.609531 master-0 kubenswrapper[7648]: I0308 03:11:48.609505 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.609693 master-0 kubenswrapper[7648]: I0308 03:11:48.609657 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:48.609897 master-0 kubenswrapper[7648]: I0308 03:11:48.609862 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:48.610587 master-0 kubenswrapper[7648]: I0308 03:11:48.610563 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:48.610696 master-0 kubenswrapper[7648]: I0308 03:11:48.610659 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:11:48.610770 master-0 kubenswrapper[7648]: I0308 03:11:48.610666 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:48.619570 master-0 kubenswrapper[7648]: I0308 03:11:48.619521 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 03:11:48.619702 master-0 kubenswrapper[7648]: I0308 03:11:48.619537 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 03:11:48.620422 master-0 kubenswrapper[7648]: I0308 03:11:48.620384 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:48.620857 master-0 kubenswrapper[7648]: I0308 03:11:48.620824 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 03:11:48.620935 master-0 kubenswrapper[7648]: I0308 03:11:48.620908 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:48.621569 master-0 kubenswrapper[7648]: I0308 03:11:48.621405 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 03:11:48.622136 master-0 kubenswrapper[7648]: I0308 03:11:48.622100 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 03:11:48.625561 master-0 kubenswrapper[7648]: I0308 03:11:48.625515 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:48.625715 master-0 kubenswrapper[7648]: I0308 03:11:48.625682 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:48.639328 master-0 kubenswrapper[7648]: I0308 03:11:48.639291 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 03:11:48.642004 master-0 kubenswrapper[7648]: I0308 03:11:48.641979 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca\") pod 
\"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:48.651909 master-0 kubenswrapper[7648]: I0308 03:11:48.651888 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 03:11:48.671474 master-0 kubenswrapper[7648]: I0308 03:11:48.671365 7648 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 03:11:48.671923 master-0 kubenswrapper[7648]: I0308 03:11:48.671865 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 03:11:48.692030 master-0 kubenswrapper[7648]: I0308 03:11:48.691994 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 03:11:48.698711 master-0 kubenswrapper[7648]: I0308 03:11:48.698680 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:48.703303 master-0 kubenswrapper[7648]: I0308 03:11:48.703241 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.703354 master-0 kubenswrapper[7648]: I0308 03:11:48.703312 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7hzl\" 
(UniqueName: \"kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:48.703409 master-0 kubenswrapper[7648]: I0308 03:11:48.703352 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb4bq\" (UniqueName: \"kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:48.703409 master-0 kubenswrapper[7648]: I0308 03:11:48.703391 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:48.703520 master-0 kubenswrapper[7648]: I0308 03:11:48.703428 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:11:48.703520 master-0 kubenswrapper[7648]: I0308 03:11:48.703464 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:48.703575 master-0 kubenswrapper[7648]: I0308 03:11:48.703544 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.703608 master-0 kubenswrapper[7648]: I0308 03:11:48.703576 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.703636 master-0 kubenswrapper[7648]: I0308 03:11:48.703611 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:11:48.703667 master-0 kubenswrapper[7648]: I0308 03:11:48.703642 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.703695 master-0 kubenswrapper[7648]: I0308 03:11:48.703684 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.703742 master-0 kubenswrapper[7648]: I0308 03:11:48.703716 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:48.703796 master-0 kubenswrapper[7648]: I0308 03:11:48.703757 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.703923 master-0 kubenswrapper[7648]: E0308 03:11:48.703906 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:11:48.703989 master-0 kubenswrapper[7648]: I0308 03:11:48.703950 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.704066 master-0 kubenswrapper[7648]: I0308 03:11:48.704045 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.704117 master-0 kubenswrapper[7648]: E0308 03:11:48.704106 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.204042373 +0000 UTC m=+1.815360663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found Mar 08 03:11:48.704197 master-0 kubenswrapper[7648]: I0308 03:11:48.704185 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.704275 master-0 kubenswrapper[7648]: I0308 03:11:48.704263 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:48.704356 master-0 kubenswrapper[7648]: I0308 03:11:48.704344 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: 
\"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.704505 master-0 kubenswrapper[7648]: I0308 03:11:48.704479 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.704582 master-0 kubenswrapper[7648]: I0308 03:11:48.704570 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:48.704824 master-0 kubenswrapper[7648]: I0308 03:11:48.704784 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:11:48.704869 master-0 kubenswrapper[7648]: I0308 03:11:48.704819 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:11:48.704869 master-0 kubenswrapper[7648]: I0308 03:11:48.704842 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hb9\" 
(UniqueName: \"kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.704928 master-0 kubenswrapper[7648]: I0308 03:11:48.704884 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.704928 master-0 kubenswrapper[7648]: I0308 03:11:48.704905 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.704985 master-0 kubenswrapper[7648]: I0308 03:11:48.704948 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.704985 master-0 kubenswrapper[7648]: I0308 03:11:48.704969 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:48.705035 master-0 kubenswrapper[7648]: I0308 03:11:48.704997 7648 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.705061 master-0 kubenswrapper[7648]: I0308 03:11:48.705047 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.705090 master-0 kubenswrapper[7648]: I0308 03:11:48.705067 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.705090 master-0 kubenswrapper[7648]: I0308 03:11:48.705084 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.705154 master-0 kubenswrapper[7648]: I0308 03:11:48.705137 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " 
pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:48.705189 master-0 kubenswrapper[7648]: I0308 03:11:48.705158 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.705189 master-0 kubenswrapper[7648]: I0308 03:11:48.705177 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:48.705308 master-0 kubenswrapper[7648]: I0308 03:11:48.705217 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.705308 master-0 kubenswrapper[7648]: I0308 03:11:48.705214 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:11:48.705308 master-0 kubenswrapper[7648]: I0308 03:11:48.705236 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:48.705308 master-0 kubenswrapper[7648]: I0308 03:11:48.705284 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.705416 master-0 kubenswrapper[7648]: E0308 03:11:48.703945 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:11:48.705416 master-0 kubenswrapper[7648]: E0308 03:11:48.704114 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:48.705462 master-0 kubenswrapper[7648]: E0308 03:11:48.705429 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.20540221 +0000 UTC m=+1.816720500 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:48.705525 master-0 kubenswrapper[7648]: E0308 03:11:48.705512 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:48.705558 master-0 kubenswrapper[7648]: E0308 03:11:48.705540 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.205531574 +0000 UTC m=+1.816849864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:48.705603 master-0 kubenswrapper[7648]: I0308 03:11:48.705554 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.705603 master-0 kubenswrapper[7648]: I0308 03:11:48.704346 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " 
pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:48.705663 master-0 kubenswrapper[7648]: I0308 03:11:48.704575 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:11:48.705663 master-0 kubenswrapper[7648]: E0308 03:11:48.705640 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.205633498 +0000 UTC m=+1.816951788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found Mar 08 03:11:48.705917 master-0 kubenswrapper[7648]: I0308 03:11:48.705712 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.705917 master-0 kubenswrapper[7648]: E0308 03:11:48.705800 7648 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:48.705917 master-0 kubenswrapper[7648]: I0308 03:11:48.705842 7648 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.705917 master-0 kubenswrapper[7648]: E0308 03:11:48.705848 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.205837705 +0000 UTC m=+1.817155995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706063 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706088 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706121 7648 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706195 7648 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706227 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.206215688 +0000 UTC m=+1.817533978 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706257 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706274 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.20626814 +0000 UTC m=+1.817586430 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706288 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706327 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706365 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706389 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706407 7648 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706422 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706596 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706616 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706619 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706662 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706683 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.206676714 +0000 UTC m=+1.817995004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706716 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706732 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706751 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706773 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706794 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706810 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706825 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " 
pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706841 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706857 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706876 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706928 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706940 7648 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: 
I0308 03:11:48.706957 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706963 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.206954963 +0000 UTC m=+1.818273243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.706975 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422p2\" (UniqueName: \"kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.706999 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707002 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.707016 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.207009435 +0000 UTC m=+1.818327715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707034 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707039 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707063 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707115 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707145 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707146 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707165 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707168 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707187 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707203 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707223 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707250 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707267 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: E0308 03:11:48.707416 7648 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707451 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c5z\" (UniqueName: \"kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707504 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707546 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:48.707680 master-0 kubenswrapper[7648]: I0308 03:11:48.707650 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:11:48.709842 master-0 kubenswrapper[7648]: E0308 03:11:48.707798 7648 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:11:48.709842 master-0 kubenswrapper[7648]: E0308 03:11:48.707826 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.207819193 +0000 UTC m=+1.819137483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found
Mar 08 03:11:48.709842 master-0 kubenswrapper[7648]: E0308 03:11:48.707845 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.207840714 +0000 UTC m=+1.819159004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found
Mar 08 03:11:48.709842 master-0 kubenswrapper[7648]: E0308 03:11:48.708015 7648 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:48.709842 master-0 kubenswrapper[7648]: E0308 03:11:48.708040 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.208033051 +0000 UTC m=+1.819351341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found
Mar 08 03:11:48.709842 master-0 kubenswrapper[7648]: I0308 03:11:48.708093 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar 08 03:11:48.709842 master-0 kubenswrapper[7648]: I0308 03:11:48.708268 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.710187 master-0 kubenswrapper[7648]: I0308 03:11:48.707562 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.710187 master-0 kubenswrapper[7648]: I0308 03:11:48.710073 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv8wt\" (UniqueName: \"kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74"
Mar 08 03:11:48.711469 master-0 kubenswrapper[7648]: I0308 03:11:48.711440 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 03:11:48.730838 master-0 kubenswrapper[7648]: I0308 03:11:48.730812 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 03:11:48.733125 master-0 kubenswrapper[7648]: I0308 03:11:48.733079 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:11:48.752084 master-0 kubenswrapper[7648]: I0308 03:11:48.751895 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 08 03:11:48.756327 master-0 kubenswrapper[7648]: I0308 03:11:48.756265 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc"
Mar 08 03:11:48.771952 master-0 kubenswrapper[7648]: I0308 03:11:48.771819 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 08 03:11:48.778362 master-0 kubenswrapper[7648]: I0308 03:11:48.778344 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.791673 master-0 kubenswrapper[7648]: I0308 03:11:48.791538 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 08 03:11:48.798629 master-0 kubenswrapper[7648]: I0308 03:11:48.798595 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.811617 master-0 kubenswrapper[7648]: I0308 03:11:48.811562 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.811861 master-0 kubenswrapper[7648]: I0308 03:11:48.811844 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.812046 master-0 kubenswrapper[7648]: I0308 03:11:48.812029 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812197 master-0 kubenswrapper[7648]: I0308 03:11:48.812178 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812344 master-0 kubenswrapper[7648]: I0308 03:11:48.812327 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812476 master-0 kubenswrapper[7648]: I0308 03:11:48.812464 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812635 master-0 kubenswrapper[7648]: I0308 03:11:48.812623 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812750 master-0 kubenswrapper[7648]: I0308 03:11:48.812738 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.812954 master-0 kubenswrapper[7648]: I0308 03:11:48.811746 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.812995 master-0 kubenswrapper[7648]: I0308 03:11:48.812435 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812995 master-0 kubenswrapper[7648]: I0308 03:11:48.812147 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812995 master-0 kubenswrapper[7648]: I0308 03:11:48.812586 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.812995 master-0 kubenswrapper[7648]: I0308 03:11:48.811960 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.812995 master-0 kubenswrapper[7648]: I0308 03:11:48.812701 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813143 master-0 kubenswrapper[7648]: I0308 03:11:48.812294 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813143 master-0 kubenswrapper[7648]: I0308 03:11:48.812911 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813197 master-0 kubenswrapper[7648]: E0308 03:11:48.813142 7648 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:11:48.813197 master-0 kubenswrapper[7648]: E0308 03:11:48.813183 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.313168559 +0000 UTC m=+1.924486849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : secret "metrics-daemon-secret" not found
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.812945 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813331 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813372 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813399 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813422 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813434 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813451 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813460 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813474 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813501 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813507 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813525 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813542 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813578 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813510 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813609 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813642 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.813670 master-0 kubenswrapper[7648]: I0308 03:11:48.813668 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.813720 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.813761 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.813790 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.813829 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.813851 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.813877 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.814018 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.814098 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.814146 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.814172 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.814367 master-0 kubenswrapper[7648]: I0308 03:11:48.814264 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.814872 master-0 kubenswrapper[7648]: I0308 03:11:48.814828 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.814905 master-0 kubenswrapper[7648]: I0308 03:11:48.814891 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.814936 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.814966 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.814998 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815006 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815073 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815102 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815134 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815147 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815163 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815169 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815187 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815215 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: E0308 03:11:48.815230 7648 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815235 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: E0308 03:11:48.815254 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:49.31524594 +0000 UTC m=+1.926564230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815276 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815280 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.815319 master-0 kubenswrapper[7648]: I0308 03:11:48.815301 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:48.833411 master-0 kubenswrapper[7648]: I0308 03:11:48.833352 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhxt\" (UniqueName: 
\"kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:11:48.854770 master-0 kubenswrapper[7648]: I0308 03:11:48.854721 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:48.863348 master-0 kubenswrapper[7648]: I0308 03:11:48.863325 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pfx\" (UniqueName: \"kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:11:48.873449 master-0 kubenswrapper[7648]: I0308 03:11:48.872650 7648 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 03:11:48.892345 master-0 kubenswrapper[7648]: I0308 03:11:48.892296 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf24l\" (UniqueName: \"kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:11:48.913102 master-0 kubenswrapper[7648]: I0308 03:11:48.913068 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llf9g\" (UniqueName: 
\"kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:48.926301 master-0 kubenswrapper[7648]: I0308 03:11:48.926267 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:11:48.963045 master-0 kubenswrapper[7648]: I0308 03:11:48.958611 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbqkj\" (UniqueName: \"kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:48.967145 master-0 kubenswrapper[7648]: I0308 03:11:48.967102 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfxd\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:48.993477 master-0 kubenswrapper[7648]: I0308 03:11:48.993395 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jk4m\" (UniqueName: \"kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: 
\"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:11:49.031672 master-0 kubenswrapper[7648]: I0308 03:11:49.031620 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mv56\" (UniqueName: \"kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56\") pod \"csi-snapshot-controller-operator-5685fbc7d-8fxl8\" (UID: \"ba9496ed-060e-4118-9da6-89b82bd49263\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" Mar 08 03:11:49.046650 master-0 kubenswrapper[7648]: I0308 03:11:49.046606 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2vmz\" (UniqueName: \"kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:49.063635 master-0 kubenswrapper[7648]: I0308 03:11:49.063549 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf57p\" (UniqueName: \"kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:11:49.096520 master-0 kubenswrapper[7648]: I0308 03:11:49.095774 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:11:49.103233 master-0 kubenswrapper[7648]: I0308 
03:11:49.103180 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghdk\" (UniqueName: \"kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:49.126388 master-0 kubenswrapper[7648]: I0308 03:11:49.126305 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdxk\" (UniqueName: \"kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:11:49.145528 master-0 kubenswrapper[7648]: I0308 03:11:49.145197 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:49.165877 master-0 kubenswrapper[7648]: I0308 03:11:49.165824 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqnd\" (UniqueName: \"kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:11:49.187537 master-0 kubenswrapper[7648]: I0308 03:11:49.187476 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:11:49.207417 master-0 kubenswrapper[7648]: I0308 03:11:49.207393 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5ll\" (UniqueName: \"kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:49.221558 master-0 kubenswrapper[7648]: I0308 03:11:49.221373 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:49.221558 master-0 kubenswrapper[7648]: I0308 03:11:49.221423 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:49.221558 master-0 kubenswrapper[7648]: E0308 03:11:49.221541 7648 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 08 03:11:49.221761 master-0 kubenswrapper[7648]: E0308 03:11:49.221589 7648 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.221570602 +0000 UTC m=+2.832888952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found Mar 08 03:11:49.221811 master-0 kubenswrapper[7648]: E0308 03:11:49.221741 7648 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:49.221901 master-0 kubenswrapper[7648]: E0308 03:11:49.221853 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.221812651 +0000 UTC m=+2.833130991 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found Mar 08 03:11:49.222002 master-0 kubenswrapper[7648]: I0308 03:11:49.221957 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:49.222052 master-0 kubenswrapper[7648]: I0308 03:11:49.222018 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.222107 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.222373 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.222423 7648 secret.go:189] Couldn't get 
secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.222516 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.222453103 +0000 UTC m=+2.833771433 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.222559 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.222536146 +0000 UTC m=+2.833854556 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.222616 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.222694 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.222762 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.222802 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " 
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.222870 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.222920 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223057 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223097 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223121 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.223102395 +0000 UTC m=+2.834420735 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223121 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223152 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.223138776 +0000 UTC m=+2.834457106 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223194 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.223176818 +0000 UTC m=+2.834495208 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.223238 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223279 7648 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223309 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: I0308 03:11:49.223342 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:49.223379 master-0 kubenswrapper[7648]: E0308 03:11:49.223360 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:11:50.223338493 +0000 UTC m=+2.834656783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223433 7648 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223439 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.223418086 +0000 UTC m=+2.834736426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223536 7648 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223535 7648 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223573 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.223559291 +0000 UTC m=+2.834877651 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223368 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223593 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. 
No retries permitted until 2026-03-08 03:11:50.223583052 +0000 UTC m=+2.834901442 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223612 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.223600542 +0000 UTC m=+2.834918882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found Mar 08 03:11:49.224026 master-0 kubenswrapper[7648]: E0308 03:11:49.223628 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.223622343 +0000 UTC m=+2.834940853 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found Mar 08 03:11:49.228115 master-0 kubenswrapper[7648]: I0308 03:11:49.228030 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlj9x\" (UniqueName: \"kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:49.247130 master-0 kubenswrapper[7648]: I0308 03:11:49.246966 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dn9\" (UniqueName: \"kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:49.267235 master-0 kubenswrapper[7648]: I0308 03:11:49.267183 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lb9w\" (UniqueName: \"kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:11:49.288015 master-0 kubenswrapper[7648]: I0308 03:11:49.287966 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 
03:11:49.304702 master-0 kubenswrapper[7648]: I0308 03:11:49.304668 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qqr\" (UniqueName: \"kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:11:49.325130 master-0 kubenswrapper[7648]: I0308 03:11:49.325018 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:49.325130 master-0 kubenswrapper[7648]: I0308 03:11:49.325127 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:49.325384 master-0 kubenswrapper[7648]: E0308 03:11:49.325246 7648 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 08 03:11:49.325384 master-0 kubenswrapper[7648]: E0308 03:11:49.325288 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.325275521 +0000 UTC m=+2.936593811 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found Mar 08 03:11:49.326347 master-0 kubenswrapper[7648]: E0308 03:11:49.325606 7648 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 08 03:11:49.326347 master-0 kubenswrapper[7648]: E0308 03:11:49.325632 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:50.325625193 +0000 UTC m=+2.936943473 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : secret "metrics-daemon-secret" not found Mar 08 03:11:49.327906 master-0 kubenswrapper[7648]: I0308 03:11:49.327872 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkfwp\" (UniqueName: \"kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:11:49.343352 master-0 kubenswrapper[7648]: I0308 03:11:49.343051 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zh5b\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " 
pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:49.362306 master-0 kubenswrapper[7648]: I0308 03:11:49.362259 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vt6t\" (UniqueName: \"kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:49.421454 master-0 kubenswrapper[7648]: E0308 03:11:49.420983 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:11:49.424871 master-0 kubenswrapper[7648]: E0308 03:11:49.424663 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:11:49.430362 master-0 kubenswrapper[7648]: E0308 03:11:49.430015 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:49.437625 master-0 kubenswrapper[7648]: E0308 03:11:49.437356 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:49.460034 master-0 kubenswrapper[7648]: W0308 03:11:49.459519 7648 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set 
securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 08 03:11:49.460034 master-0 kubenswrapper[7648]: E0308 03:11:49.459612 7648 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:11:49.486817 master-0 kubenswrapper[7648]: I0308 03:11:49.486763 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7hzl\" (UniqueName: \"kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:11:49.509932 master-0 kubenswrapper[7648]: I0308 03:11:49.509784 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb4bq\" (UniqueName: \"kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:11:49.534544 master-0 kubenswrapper[7648]: I0308 03:11:49.534421 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hb9\" (UniqueName: \"kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:49.556096 master-0 kubenswrapper[7648]: 
I0308 03:11:49.556024 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:11:49.568931 master-0 kubenswrapper[7648]: I0308 03:11:49.568873 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422p2\" (UniqueName: \"kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:11:49.587242 master-0 kubenswrapper[7648]: I0308 03:11:49.587132 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c5z\" (UniqueName: \"kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:11:49.606902 master-0 kubenswrapper[7648]: I0308 03:11:49.606844 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv8wt\" (UniqueName: \"kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:11:49.629052 master-0 kubenswrapper[7648]: I0308 03:11:49.628982 7648 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 03:11:49.642873 master-0 kubenswrapper[7648]: I0308 03:11:49.642803 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:49.777911 master-0 kubenswrapper[7648]: I0308 03:11:49.777821 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:49.969870 master-0 kubenswrapper[7648]: E0308 03:11:49.963430 7648 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5" Mar 08 03:11:49.969870 master-0 kubenswrapper[7648]: E0308 03:11:49.963672 7648 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-config-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5,Command:[cluster-config-operator operator --operator-version=$(OPERATOR_IMAGE_VERSION) 
--authoritative-feature-gate-dir=/available-featuregates],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:available-featuregates,ReadOnly:false,MountPath:/available-featuregates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trhxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:1,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:1,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-config-operator-64488f9d78-zg4zr_openshift-config-operator(3bf93333-b537-4f23-9c77-6a245b290fe3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 03:11:49.969870 master-0 kubenswrapper[7648]: E0308 03:11:49.964830 7648 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" Mar 08 03:11:50.192531 master-0 kubenswrapper[7648]: I0308 03:11:50.191791 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l5x6h"] Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236653 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" 
(UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236686 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236710 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236740 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236763 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236782 
7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236798 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236826 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236846 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236865 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236882 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236906 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: I0308 03:11:50.236926 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237028 7648 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237069 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 
nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237055766 +0000 UTC m=+4.848374056 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237111 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237130 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237123948 +0000 UTC m=+4.848442238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237162 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237178 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.23717338 +0000 UTC m=+4.848491670 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237209 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237225 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237220041 +0000 UTC m=+4.848538331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237253 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237268 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237263543 +0000 UTC m=+4.848581833 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237338 7648 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237393 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237357706 +0000 UTC m=+4.848676006 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237473 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237510 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237502301 +0000 UTC m=+4.848820591 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237548 7648 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237569 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237562573 +0000 UTC m=+4.848880863 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237611 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237635 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237627695 +0000 UTC m=+4.848945985 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237669 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237687 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237682037 +0000 UTC m=+4.849000327 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237720 7648 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237734 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237729569 +0000 UTC m=+4.849047859 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237765 7648 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237785 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237777251 +0000 UTC m=+4.849095541 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237826 7648 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:11:50.243555 master-0 kubenswrapper[7648]: E0308 03:11:50.237843 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.237838413 +0000 UTC m=+4.849156703 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found
Mar 08 03:11:50.338341 master-0 kubenswrapper[7648]: I0308 03:11:50.338300 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:11:50.338563 master-0 kubenswrapper[7648]: I0308 03:11:50.338540 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:50.338803 master-0 kubenswrapper[7648]: E0308 03:11:50.338775 7648 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:11:50.338875 master-0 kubenswrapper[7648]: E0308 03:11:50.338854 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.338812547 +0000 UTC m=+4.950130837 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : secret "metrics-daemon-secret" not found
Mar 08 03:11:50.338965 master-0 kubenswrapper[7648]: E0308 03:11:50.338949 7648 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:11:50.339015 master-0 kubenswrapper[7648]: E0308 03:11:50.339009 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:52.338973343 +0000 UTC m=+4.950291633 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found
Mar 08 03:11:50.704591 master-0 kubenswrapper[7648]: I0308 03:11:50.701982 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" event={"ID":"70fba73e-c201-4866-bc69-64892ea5bdca","Type":"ContainerStarted","Data":"7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e"}
Mar 08 03:11:50.704591 master-0 kubenswrapper[7648]: I0308 03:11:50.703330 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" event={"ID":"8c0192f3-2e60-42c6-9836-c70a9fa407d5","Type":"ContainerStarted","Data":"2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9"}
Mar 08 03:11:50.704591 master-0 kubenswrapper[7648]: I0308 03:11:50.704405 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" event={"ID":"b83ab56c-e28d-4e82-ae8f-92649a1448ed","Type":"ContainerStarted","Data":"ab858aba9fe747164d134176fff1d99d6f77b5114eeaf6f38c2480128cb7485f"}
Mar 08 03:11:50.705777 master-0 kubenswrapper[7648]: I0308 03:11:50.705500 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" event={"ID":"6432d23b-a55a-4131-83d5-5f16419809dd","Type":"ContainerStarted","Data":"d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a"}
Mar 08 03:11:50.714601 master-0 kubenswrapper[7648]: I0308 03:11:50.714557 7648 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" containerID="c01db717080df3353657720c6a107d2d2fc41a3dd4edfcbd6e02a08696eb5639" exitCode=0
Mar 08 03:11:50.715894 master-0 kubenswrapper[7648]: I0308 03:11:50.714672 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerDied","Data":"c01db717080df3353657720c6a107d2d2fc41a3dd4edfcbd6e02a08696eb5639"}
Mar 08 03:11:50.717578 master-0 kubenswrapper[7648]: I0308 03:11:50.717545 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" event={"ID":"ba9496ed-060e-4118-9da6-89b82bd49263","Type":"ContainerStarted","Data":"73db2b17db7b45f368583714c7423ad3baed3f0e6461afd93878b41dc72e8454"}
Mar 08 03:11:50.737835 master-0 kubenswrapper[7648]: I0308 03:11:50.737787 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" event={"ID":"e71caa06-6ce7-47c9-a267-21f6b6af9247","Type":"ContainerStarted","Data":"5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2"}
Mar 08 03:11:50.751225 master-0 kubenswrapper[7648]: I0308 03:11:50.751178 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" event={"ID":"aadbbe97-2a03-40da-846d-252e29661f67","Type":"ContainerStarted","Data":"dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62"}
Mar 08 03:11:50.752406 master-0 kubenswrapper[7648]: I0308 03:11:50.752361 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l5x6h" event={"ID":"aa781f72-e72f-47e1-b37a-977340c182c8","Type":"ContainerStarted","Data":"844cbfeef43db656a844b3b7092c201c78fd0e1335bc233850493989096471a0"}
Mar 08 03:11:50.752406 master-0 kubenswrapper[7648]: I0308 03:11:50.752405 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l5x6h" event={"ID":"aa781f72-e72f-47e1-b37a-977340c182c8","Type":"ContainerStarted","Data":"ac8609fedee0058569da7d8a957a2d1f873b1e97869f7515236d24de03c1a1d3"}
Mar 08 03:11:50.753931 master-0 kubenswrapper[7648]: I0308 03:11:50.753896 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" event={"ID":"c0a08ddb-1045-4631-ba52-93f3046ebd0a","Type":"ContainerStarted","Data":"1a20bdbedb5b13853225f367842b80deec1d4120a3bc963794fd1350f7fbce22"}
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: I0308 03:11:51.176602 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"]
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: E0308 03:11:51.176748 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller"
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: I0308 03:11:51.176758 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller"
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: E0308 03:11:51.176773 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober"
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: I0308 03:11:51.176779 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober"
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: I0308 03:11:51.176836 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller"
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: I0308 03:11:51.176847 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober"
Mar 08 03:11:51.177809 master-0 kubenswrapper[7648]: I0308 03:11:51.177089 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"
Mar 08 03:11:51.191140 master-0 kubenswrapper[7648]: I0308 03:11:51.191104 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"]
Mar 08 03:11:51.252730 master-0 kubenswrapper[7648]: I0308 03:11:51.252662 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 08 03:11:51.254605 master-0 kubenswrapper[7648]: I0308 03:11:51.254540 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzmjd\" (UniqueName: \"kubernetes.io/projected/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad-kube-api-access-rzmjd\") pod \"csi-snapshot-controller-7577d6f48-j6jpn\" (UID: \"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"
Mar 08 03:11:51.358987 master-0 kubenswrapper[7648]: I0308 03:11:51.358936 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzmjd\" (UniqueName: \"kubernetes.io/projected/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad-kube-api-access-rzmjd\") pod \"csi-snapshot-controller-7577d6f48-j6jpn\" (UID: \"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"
Mar 08 03:11:51.445327 master-0 kubenswrapper[7648]: I0308 03:11:51.445119 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzmjd\" (UniqueName: \"kubernetes.io/projected/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad-kube-api-access-rzmjd\") pod \"csi-snapshot-controller-7577d6f48-j6jpn\" (UID: \"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"
Mar 08 03:11:51.503709 master-0 kubenswrapper[7648]: I0308 03:11:51.503642 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"
Mar 08 03:11:51.722332 master-0 kubenswrapper[7648]: I0308 03:11:51.721918 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"]
Mar 08 03:11:51.769770 master-0 kubenswrapper[7648]: I0308 03:11:51.769165 7648 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 03:11:51.769770 master-0 kubenswrapper[7648]: I0308 03:11:51.769717 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" event={"ID":"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad","Type":"ContainerStarted","Data":"f330f5d17188d70a58eecf3d0a2330b70f4408aee114d8d6465e47081bd71e07"}
Mar 08 03:11:52.063472 master-0 kubenswrapper[7648]: I0308 03:11:52.063051 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"]
Mar 08 03:11:52.063858 master-0 kubenswrapper[7648]: I0308 03:11:52.063832 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"
Mar 08 03:11:52.066845 master-0 kubenswrapper[7648]: I0308 03:11:52.065779 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"]
Mar 08 03:11:52.067541 master-0 kubenswrapper[7648]: I0308 03:11:52.067268 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 08 03:11:52.067541 master-0 kubenswrapper[7648]: I0308 03:11:52.067351 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 08 03:11:52.183003 master-0 kubenswrapper[7648]: I0308 03:11:52.182892 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pxwl\" (UniqueName: \"kubernetes.io/projected/306b824f-dcfb-4e69-9a23-64dfbae61852-kube-api-access-4pxwl\") pod \"migrator-57ccdf9b5-xps42\" (UID: \"306b824f-dcfb-4e69-9a23-64dfbae61852\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"
Mar 08 03:11:52.283825 master-0 kubenswrapper[7648]: I0308 03:11:52.283762 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:11:52.284021 master-0 kubenswrapper[7648]: I0308 03:11:52.283852 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:52.284021 master-0 kubenswrapper[7648]: I0308 03:11:52.283899 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pxwl\" (UniqueName: \"kubernetes.io/projected/306b824f-dcfb-4e69-9a23-64dfbae61852-kube-api-access-4pxwl\") pod \"migrator-57ccdf9b5-xps42\" (UID: \"306b824f-dcfb-4e69-9a23-64dfbae61852\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"
Mar 08 03:11:52.284021 master-0 kubenswrapper[7648]: I0308 03:11:52.283940 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:11:52.284021 master-0 kubenswrapper[7648]: I0308 03:11:52.283981 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:11:52.284253 master-0 kubenswrapper[7648]: I0308 03:11:52.284020 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:52.284253 master-0 kubenswrapper[7648]: I0308 03:11:52.284081 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:52.284253 master-0 kubenswrapper[7648]: I0308 03:11:52.284157 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:11:52.284253 master-0 kubenswrapper[7648]: I0308 03:11:52.284209 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:11:52.284547 master-0 kubenswrapper[7648]: I0308 03:11:52.284256 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:11:52.284547 master-0 kubenswrapper[7648]: I0308 03:11:52.284334 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:52.284547 master-0 kubenswrapper[7648]: I0308 03:11:52.284381 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:52.284547 master-0 kubenswrapper[7648]: E0308 03:11:52.284518 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:11:52.284547 master-0 kubenswrapper[7648]: E0308 03:11:52.284532 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284558 7648 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284587 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.284567152 +0000 UTC m=+8.895885432 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284661 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284698 7648 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284753 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.284599803 +0000 UTC m=+8.895918093 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "performance-addon-operator-webhook-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: I0308 03:11:52.284781 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: I0308 03:11:52.284813 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284829 7648 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284863 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284910 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.284881253 +0000 UTC m=+8.896199573 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284921 7648 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284935 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.284923535 +0000 UTC m=+8.896241855 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284949 7648 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284977 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.284957 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.284945305 +0000 UTC m=+8.896263635 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285009 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285018 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.285005017 +0000 UTC m=+8.896323427 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285034 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.285026278 +0000 UTC m=+8.896344708 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285051 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.285042189 +0000 UTC m=+8.896360649 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285066 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.285060009 +0000 UTC m=+8.896378429 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285079 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.2850726 +0000 UTC m=+8.896391020 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285084 7648 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285092 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.28508632 +0000 UTC m=+8.896404760 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285127 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls podName:bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.285116511 +0000 UTC m=+8.896434911 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-nnd8x" (UID: "bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c") : secret "node-tuning-operator-tls" not found
Mar 08 03:11:52.285390 master-0 kubenswrapper[7648]: E0308 03:11:52.285181 7648 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:52.287946 master-0 kubenswrapper[7648]: E0308 03:11:52.285978 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.28596394 +0000 UTC m=+8.897282310 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found
Mar 08 03:11:52.310130 master-0 kubenswrapper[7648]: I0308 03:11:52.310067 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pxwl\" (UniqueName: \"kubernetes.io/projected/306b824f-dcfb-4e69-9a23-64dfbae61852-kube-api-access-4pxwl\") pod \"migrator-57ccdf9b5-xps42\" (UID: \"306b824f-dcfb-4e69-9a23-64dfbae61852\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"
Mar 08 03:11:52.385982 master-0 kubenswrapper[7648]: I0308 03:11:52.385929 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:52.386193 master-0 kubenswrapper[7648]: E0308 03:11:52.386149 7648 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:11:52.386272 master-0 kubenswrapper[7648]: E0308 03:11:52.386246 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.38622496 +0000 UTC m=+8.997543350 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : secret "metrics-daemon-secret" not found
Mar 08 03:11:52.386699 master-0 kubenswrapper[7648]: I0308 03:11:52.386661 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:11:52.386902 master-0 kubenswrapper[7648]: E0308 03:11:52.386881 7648 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:11:52.387001 master-0 kubenswrapper[7648]: E0308 03:11:52.386916 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.386906614 +0000 UTC m=+8.998224904 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found
Mar 08 03:11:52.389689 master-0 kubenswrapper[7648]: I0308 03:11:52.389628 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"
Mar 08 03:11:52.472880 master-0 kubenswrapper[7648]: I0308 03:11:52.472727 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx"]
Mar 08 03:11:52.473297 master-0 kubenswrapper[7648]: I0308 03:11:52.473274 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx"
Mar 08 03:11:52.477857 master-0 kubenswrapper[7648]: I0308 03:11:52.477821 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:11:52.478144 master-0 kubenswrapper[7648]: I0308 03:11:52.477926 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:11:52.478144 master-0 kubenswrapper[7648]: I0308 03:11:52.478118 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:11:52.480515 master-0 kubenswrapper[7648]: I0308 03:11:52.478290 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:11:52.480515 master-0 kubenswrapper[7648]: I0308 03:11:52.478336 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:11:52.480515 master-0 kubenswrapper[7648]: I0308 03:11:52.478522 7648
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:11:52.493711 master-0 kubenswrapper[7648]: I0308 03:11:52.491900 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx"] Mar 08 03:11:52.588671 master-0 kubenswrapper[7648]: I0308 03:11:52.588601 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.588968 master-0 kubenswrapper[7648]: I0308 03:11:52.588802 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.588968 master-0 kubenswrapper[7648]: I0308 03:11:52.588871 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.588968 master-0 kubenswrapper[7648]: I0308 03:11:52.588944 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: 
\"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.589191 master-0 kubenswrapper[7648]: I0308 03:11:52.588972 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzflb\" (UniqueName: \"kubernetes.io/projected/34f52c72-c20a-4439-a48f-2030ffe30ff8-kube-api-access-vzflb\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.690203 master-0 kubenswrapper[7648]: I0308 03:11:52.690087 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.690203 master-0 kubenswrapper[7648]: I0308 03:11:52.690154 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.690203 master-0 kubenswrapper[7648]: I0308 03:11:52.690202 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.690506 master-0 kubenswrapper[7648]: I0308 03:11:52.690224 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-vzflb\" (UniqueName: \"kubernetes.io/projected/34f52c72-c20a-4439-a48f-2030ffe30ff8-kube-api-access-vzflb\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.690506 master-0 kubenswrapper[7648]: I0308 03:11:52.690254 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.690506 master-0 kubenswrapper[7648]: E0308 03:11:52.690374 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 08 03:11:52.690506 master-0 kubenswrapper[7648]: E0308 03:11:52.690435 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.190416218 +0000 UTC m=+5.801734508 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : configmap "openshift-global-ca" not found Mar 08 03:11:52.690723 master-0 kubenswrapper[7648]: E0308 03:11:52.690704 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:52.690794 master-0 kubenswrapper[7648]: E0308 03:11:52.690731 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.190722678 +0000 UTC m=+5.802040958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : secret "serving-cert" not found Mar 08 03:11:52.690794 master-0 kubenswrapper[7648]: E0308 03:11:52.690760 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Mar 08 03:11:52.690794 master-0 kubenswrapper[7648]: E0308 03:11:52.690777 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.19077146 +0000 UTC m=+5.802089750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : configmap "config" not found Mar 08 03:11:52.690957 master-0 kubenswrapper[7648]: E0308 03:11:52.690799 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:52.690957 master-0 kubenswrapper[7648]: E0308 03:11:52.690817 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:53.190811031 +0000 UTC m=+5.802129321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : configmap "client-ca" not found Mar 08 03:11:52.714060 master-0 kubenswrapper[7648]: I0308 03:11:52.713894 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzflb\" (UniqueName: \"kubernetes.io/projected/34f52c72-c20a-4439-a48f-2030ffe30ff8-kube-api-access-vzflb\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:52.859328 master-0 kubenswrapper[7648]: I0308 03:11:52.859285 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:52.980427 master-0 kubenswrapper[7648]: I0308 03:11:52.980313 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:52.983713 master-0 
kubenswrapper[7648]: I0308 03:11:52.983673 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:53.020056 master-0 kubenswrapper[7648]: I0308 03:11:53.019995 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:53.026707 master-0 kubenswrapper[7648]: I0308 03:11:53.026663 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:53.195556 master-0 kubenswrapper[7648]: I0308 03:11:53.195469 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:53.195717 master-0 kubenswrapper[7648]: I0308 03:11:53.195613 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:53.195964 master-0 kubenswrapper[7648]: E0308 03:11:53.195890 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:53.196021 master-0 kubenswrapper[7648]: E0308 03:11:53.195963 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:53.196021 master-0 kubenswrapper[7648]: E0308 03:11:53.195977 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: 
configmap "config" not found Mar 08 03:11:53.196021 master-0 kubenswrapper[7648]: I0308 03:11:53.195899 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:53.196144 master-0 kubenswrapper[7648]: E0308 03:11:53.196010 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:54.195995685 +0000 UTC m=+6.807313975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : configmap "client-ca" not found Mar 08 03:11:53.196144 master-0 kubenswrapper[7648]: E0308 03:11:53.196110 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:54.196089968 +0000 UTC m=+6.807408298 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : secret "serving-cert" not found Mar 08 03:11:53.196144 master-0 kubenswrapper[7648]: E0308 03:11:53.196136 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:54.196122829 +0000 UTC m=+6.807441149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : configmap "config" not found Mar 08 03:11:53.196254 master-0 kubenswrapper[7648]: I0308 03:11:53.196224 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:53.196585 master-0 kubenswrapper[7648]: E0308 03:11:53.196548 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 08 03:11:53.196637 master-0 kubenswrapper[7648]: E0308 03:11:53.196607 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:54.196592985 +0000 UTC m=+6.807911305 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : configmap "openshift-global-ca" not found Mar 08 03:11:53.320857 master-0 kubenswrapper[7648]: I0308 03:11:53.320752 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:53.359606 master-0 kubenswrapper[7648]: I0308 03:11:53.359430 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:53.679051 master-0 kubenswrapper[7648]: I0308 03:11:53.679009 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx"] Mar 08 03:11:53.679356 master-0 kubenswrapper[7648]: E0308 03:11:53.679223 7648 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" podUID="34f52c72-c20a-4439-a48f-2030ffe30ff8" Mar 08 03:11:53.712003 master-0 kubenswrapper[7648]: I0308 03:11:53.711856 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz"] Mar 08 03:11:53.712318 master-0 kubenswrapper[7648]: I0308 03:11:53.712297 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.716820 master-0 kubenswrapper[7648]: I0308 03:11:53.716773 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:11:53.717001 master-0 kubenswrapper[7648]: I0308 03:11:53.716935 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:11:53.717049 master-0 kubenswrapper[7648]: I0308 03:11:53.717028 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:11:53.717243 master-0 kubenswrapper[7648]: I0308 03:11:53.717171 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:11:53.717335 master-0 kubenswrapper[7648]: I0308 03:11:53.717288 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:11:53.730841 master-0 kubenswrapper[7648]: I0308 03:11:53.730804 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz"] Mar 08 03:11:53.777768 master-0 kubenswrapper[7648]: I0308 03:11:53.777653 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-g86jc" event={"ID":"275be8d3-df30-46f7-9d0a-806e404dfd57","Type":"ContainerStarted","Data":"8b920688f81c840f48c6b7c7a0f78fe047269aac2b75906f7ef5b4553f43eba3"} Mar 08 03:11:53.778176 master-0 kubenswrapper[7648]: I0308 03:11:53.778100 7648 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:11:53.778176 master-0 kubenswrapper[7648]: I0308 03:11:53.778162 7648 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:11:53.778284 master-0 
kubenswrapper[7648]: I0308 03:11:53.778175 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:53.783090 master-0 kubenswrapper[7648]: I0308 03:11:53.783062 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:11:53.784318 master-0 kubenswrapper[7648]: I0308 03:11:53.784017 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 08 03:11:53.786666 master-0 kubenswrapper[7648]: I0308 03:11:53.786618 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:53.791948 master-0 kubenswrapper[7648]: I0308 03:11:53.791910 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-fqhlq"] Mar 08 03:11:53.794520 master-0 kubenswrapper[7648]: I0308 03:11:53.794416 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:53.795562 master-0 kubenswrapper[7648]: I0308 03:11:53.795533 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-fqhlq"] Mar 08 03:11:53.796883 master-0 kubenswrapper[7648]: I0308 03:11:53.796819 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 03:11:53.797204 master-0 kubenswrapper[7648]: I0308 03:11:53.797185 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 03:11:53.797386 master-0 kubenswrapper[7648]: I0308 03:11:53.797368 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 03:11:53.801114 master-0 kubenswrapper[7648]: I0308 03:11:53.801079 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 03:11:53.809567 master-0 kubenswrapper[7648]: I0308 03:11:53.807788 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvszw\" (UniqueName: \"kubernetes.io/projected/734e4130-6a6f-4739-9b25-4fe9cb8561c2-kube-api-access-fvszw\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.810370 master-0 kubenswrapper[7648]: I0308 03:11:53.810325 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.810434 master-0 
kubenswrapper[7648]: I0308 03:11:53.810398 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.810662 master-0 kubenswrapper[7648]: I0308 03:11:53.810630 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-config\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.913242 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzflb\" (UniqueName: \"kubernetes.io/projected/34f52c72-c20a-4439-a48f-2030ffe30ff8-kube-api-access-vzflb\") pod \"34f52c72-c20a-4439-a48f-2030ffe30ff8\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.913542 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.913586 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/c9f377bf-79c5-4425-b5d1-256961835f62-signing-cabundle\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.913634 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.913736 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-config\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.913858 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvszw\" (UniqueName: \"kubernetes.io/projected/734e4130-6a6f-4739-9b25-4fe9cb8561c2-kube-api-access-fvszw\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.913962 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhh9\" (UniqueName: \"kubernetes.io/projected/c9f377bf-79c5-4425-b5d1-256961835f62-kube-api-access-6nhh9\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " 
pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:53.914269 master-0 kubenswrapper[7648]: I0308 03:11:53.914014 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c9f377bf-79c5-4425-b5d1-256961835f62-signing-key\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:53.915386 master-0 kubenswrapper[7648]: E0308 03:11:53.914712 7648 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:53.915386 master-0 kubenswrapper[7648]: E0308 03:11:53.914797 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:54.414775689 +0000 UTC m=+7.026093979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : configmap "client-ca" not found Mar 08 03:11:53.915386 master-0 kubenswrapper[7648]: E0308 03:11:53.914918 7648 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:53.915386 master-0 kubenswrapper[7648]: E0308 03:11:53.914947 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:54.414937675 +0000 UTC m=+7.026255965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : secret "serving-cert" not found Mar 08 03:11:53.915583 master-0 kubenswrapper[7648]: I0308 03:11:53.915392 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-config\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:53.932689 master-0 kubenswrapper[7648]: I0308 03:11:53.924882 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f52c72-c20a-4439-a48f-2030ffe30ff8-kube-api-access-vzflb" (OuterVolumeSpecName: "kube-api-access-vzflb") pod "34f52c72-c20a-4439-a48f-2030ffe30ff8" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8"). InnerVolumeSpecName "kube-api-access-vzflb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:11:53.939411 master-0 kubenswrapper[7648]: I0308 03:11:53.939374 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvszw\" (UniqueName: \"kubernetes.io/projected/734e4130-6a6f-4739-9b25-4fe9cb8561c2-kube-api-access-fvszw\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:54.015580 master-0 kubenswrapper[7648]: I0308 03:11:54.015434 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhh9\" (UniqueName: \"kubernetes.io/projected/c9f377bf-79c5-4425-b5d1-256961835f62-kube-api-access-6nhh9\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:54.015580 master-0 kubenswrapper[7648]: I0308 03:11:54.015524 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c9f377bf-79c5-4425-b5d1-256961835f62-signing-key\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:54.015905 master-0 kubenswrapper[7648]: I0308 03:11:54.015664 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c9f377bf-79c5-4425-b5d1-256961835f62-signing-cabundle\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:54.016406 master-0 kubenswrapper[7648]: I0308 03:11:54.016360 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzflb\" (UniqueName: 
\"kubernetes.io/projected/34f52c72-c20a-4439-a48f-2030ffe30ff8-kube-api-access-vzflb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:54.017316 master-0 kubenswrapper[7648]: I0308 03:11:54.017278 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c9f377bf-79c5-4425-b5d1-256961835f62-signing-cabundle\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:54.020972 master-0 kubenswrapper[7648]: I0308 03:11:54.020926 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c9f377bf-79c5-4425-b5d1-256961835f62-signing-key\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:54.030862 master-0 kubenswrapper[7648]: I0308 03:11:54.030824 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhh9\" (UniqueName: \"kubernetes.io/projected/c9f377bf-79c5-4425-b5d1-256961835f62-kube-api-access-6nhh9\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:54.125331 master-0 kubenswrapper[7648]: I0308 03:11:54.125244 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:11:54.222630 master-0 kubenswrapper[7648]: I0308 03:11:54.222398 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:54.222630 master-0 kubenswrapper[7648]: I0308 03:11:54.222503 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:54.222630 master-0 kubenswrapper[7648]: I0308 03:11:54.222583 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:54.222630 master-0 kubenswrapper[7648]: I0308 03:11:54.222624 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:54.226602 master-0 kubenswrapper[7648]: I0308 03:11:54.224790 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:54.226602 master-0 kubenswrapper[7648]: E0308 03:11:54.224888 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:54.226602 master-0 kubenswrapper[7648]: E0308 03:11:54.224939 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.224923621 +0000 UTC m=+8.836241911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : secret "serving-cert" not found Mar 08 03:11:54.226602 master-0 kubenswrapper[7648]: I0308 03:11:54.226026 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config\") pod \"controller-manager-6f7fd6c796-mxwsx\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:54.226602 master-0 kubenswrapper[7648]: E0308 03:11:54.226083 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:54.226602 master-0 kubenswrapper[7648]: E0308 03:11:54.226116 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca podName:34f52c72-c20a-4439-a48f-2030ffe30ff8 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:11:56.226105862 +0000 UTC m=+8.837424152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca") pod "controller-manager-6f7fd6c796-mxwsx" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8") : configmap "client-ca" not found Mar 08 03:11:54.320299 master-0 kubenswrapper[7648]: I0308 03:11:54.320239 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42"] Mar 08 03:11:54.323923 master-0 kubenswrapper[7648]: I0308 03:11:54.323874 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config\") pod \"34f52c72-c20a-4439-a48f-2030ffe30ff8\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " Mar 08 03:11:54.323990 master-0 kubenswrapper[7648]: I0308 03:11:54.323935 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles\") pod \"34f52c72-c20a-4439-a48f-2030ffe30ff8\" (UID: \"34f52c72-c20a-4439-a48f-2030ffe30ff8\") " Mar 08 03:11:54.324565 master-0 kubenswrapper[7648]: I0308 03:11:54.324521 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config" (OuterVolumeSpecName: "config") pod "34f52c72-c20a-4439-a48f-2030ffe30ff8" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:11:54.324681 master-0 kubenswrapper[7648]: I0308 03:11:54.324630 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "34f52c72-c20a-4439-a48f-2030ffe30ff8" (UID: "34f52c72-c20a-4439-a48f-2030ffe30ff8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:11:54.427302 master-0 kubenswrapper[7648]: I0308 03:11:54.427188 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:54.427556 master-0 kubenswrapper[7648]: I0308 03:11:54.427335 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:54.427556 master-0 kubenswrapper[7648]: I0308 03:11:54.427436 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:54.427556 master-0 kubenswrapper[7648]: I0308 03:11:54.427511 7648 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:54.427708 master-0 
kubenswrapper[7648]: E0308 03:11:54.427683 7648 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:54.427957 master-0 kubenswrapper[7648]: E0308 03:11:54.427887 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:55.427867674 +0000 UTC m=+8.039185964 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : secret "serving-cert" not found Mar 08 03:11:54.428310 master-0 kubenswrapper[7648]: E0308 03:11:54.428276 7648 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:54.428474 master-0 kubenswrapper[7648]: E0308 03:11:54.428442 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:55.428426494 +0000 UTC m=+8.039744784 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : configmap "client-ca" not found Mar 08 03:11:54.589689 master-0 kubenswrapper[7648]: W0308 03:11:54.589648 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod306b824f_dcfb_4e69_9a23_64dfbae61852.slice/crio-df48884077bbd8fba7227e7cebbab4db2812d597832db429a888dd44decf9996 WatchSource:0}: Error finding container df48884077bbd8fba7227e7cebbab4db2812d597832db429a888dd44decf9996: Status 404 returned error can't find the container with id df48884077bbd8fba7227e7cebbab4db2812d597832db429a888dd44decf9996 Mar 08 03:11:54.764999 master-0 kubenswrapper[7648]: I0308 03:11:54.764636 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-fqhlq"] Mar 08 03:11:54.775370 master-0 kubenswrapper[7648]: W0308 03:11:54.775337 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f377bf_79c5_4425_b5d1_256961835f62.slice/crio-05f6bb5a907a01ee11459dd7672be911eeedf74f96a5a1e584011854a9d81b18 WatchSource:0}: Error finding container 05f6bb5a907a01ee11459dd7672be911eeedf74f96a5a1e584011854a9d81b18: Status 404 returned error can't find the container with id 05f6bb5a907a01ee11459dd7672be911eeedf74f96a5a1e584011854a9d81b18 Mar 08 03:11:54.783117 master-0 kubenswrapper[7648]: I0308 03:11:54.783079 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" event={"ID":"c9f377bf-79c5-4425-b5d1-256961835f62","Type":"ContainerStarted","Data":"05f6bb5a907a01ee11459dd7672be911eeedf74f96a5a1e584011854a9d81b18"} Mar 08 03:11:54.784218 master-0 kubenswrapper[7648]: I0308 03:11:54.784191 7648 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42" event={"ID":"306b824f-dcfb-4e69-9a23-64dfbae61852","Type":"ContainerStarted","Data":"df48884077bbd8fba7227e7cebbab4db2812d597832db429a888dd44decf9996"} Mar 08 03:11:54.786209 master-0 kubenswrapper[7648]: I0308 03:11:54.786182 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerStarted","Data":"37ec7f6b3aeafa0c1aa240a3f289ec19e14a9c93e8dc0c62d0b70aca6f9a3fcf"} Mar 08 03:11:54.788097 master-0 kubenswrapper[7648]: I0308 03:11:54.788070 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" event={"ID":"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266","Type":"ContainerStarted","Data":"d901422733644b9a69bd0914635930a2d55c9786ff5a015eee041ee28b2a4386"} Mar 08 03:11:54.792037 master-0 kubenswrapper[7648]: I0308 03:11:54.791926 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx" Mar 08 03:11:54.792037 master-0 kubenswrapper[7648]: I0308 03:11:54.791991 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" event={"ID":"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad","Type":"ContainerStarted","Data":"34a41043128393510c095711912036e3de6953d35852c470aeee13ef6010b118"} Mar 08 03:11:54.820499 master-0 kubenswrapper[7648]: I0308 03:11:54.820402 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" podStartSLOduration=0.931161397 podStartE2EDuration="3.8203809s" podCreationTimestamp="2026-03-08 03:11:51 +0000 UTC" firstStartedPulling="2026-03-08 03:11:51.758061583 +0000 UTC m=+4.369379873" lastFinishedPulling="2026-03-08 03:11:54.647281086 +0000 UTC m=+7.258599376" observedRunningTime="2026-03-08 03:11:54.819742768 +0000 UTC m=+7.431061058" watchObservedRunningTime="2026-03-08 03:11:54.8203809 +0000 UTC m=+7.431699200" Mar 08 03:11:54.909528 master-0 kubenswrapper[7648]: I0308 03:11:54.909489 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"] Mar 08 03:11:54.909954 master-0 kubenswrapper[7648]: I0308 03:11:54.909924 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:54.912810 master-0 kubenswrapper[7648]: I0308 03:11:54.912772 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:11:54.912947 master-0 kubenswrapper[7648]: I0308 03:11:54.912927 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 03:11:54.913108 master-0 kubenswrapper[7648]: I0308 03:11:54.913089 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 03:11:54.913457 master-0 kubenswrapper[7648]: I0308 03:11:54.913435 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:11:54.913810 master-0 kubenswrapper[7648]: I0308 03:11:54.913787 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:11:54.914107 master-0 kubenswrapper[7648]: I0308 03:11:54.914072 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx"] Mar 08 03:11:54.930508 master-0 kubenswrapper[7648]: I0308 03:11:54.929027 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-mxwsx"] Mar 08 03:11:54.936422 master-0 kubenswrapper[7648]: I0308 03:11:54.936319 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:11:54.937718 master-0 kubenswrapper[7648]: I0308 03:11:54.937019 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"] Mar 08 03:11:55.040161 master-0 kubenswrapper[7648]: I0308 03:11:55.040104 7648 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-config\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.040414 master-0 kubenswrapper[7648]: I0308 03:11:55.040369 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.040523 master-0 kubenswrapper[7648]: I0308 03:11:55.040418 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.040523 master-0 kubenswrapper[7648]: I0308 03:11:55.040514 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4b8\" (UniqueName: \"kubernetes.io/projected/24d2f165-4dd2-4b16-86a0-eed3c161d716-kube-api-access-9t4b8\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.040658 master-0 kubenswrapper[7648]: I0308 03:11:55.040630 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-proxy-ca-bundles\") pod 
\"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.040749 master-0 kubenswrapper[7648]: I0308 03:11:55.040731 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f52c72-c20a-4439-a48f-2030ffe30ff8-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:55.040749 master-0 kubenswrapper[7648]: I0308 03:11:55.040748 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34f52c72-c20a-4439-a48f-2030ffe30ff8-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:11:55.143589 master-0 kubenswrapper[7648]: I0308 03:11:55.143108 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.143589 master-0 kubenswrapper[7648]: I0308 03:11:55.143177 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.143589 master-0 kubenswrapper[7648]: I0308 03:11:55.143242 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4b8\" (UniqueName: \"kubernetes.io/projected/24d2f165-4dd2-4b16-86a0-eed3c161d716-kube-api-access-9t4b8\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " 
pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.143589 master-0 kubenswrapper[7648]: E0308 03:11:55.143503 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:55.143966 master-0 kubenswrapper[7648]: E0308 03:11:55.143622 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca podName:24d2f165-4dd2-4b16-86a0-eed3c161d716 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:55.643589724 +0000 UTC m=+8.254908014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca") pod "controller-manager-6bfbc9b76d-hns7n" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716") : configmap "client-ca" not found Mar 08 03:11:55.144122 master-0 kubenswrapper[7648]: I0308 03:11:55.144061 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-proxy-ca-bundles\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.144196 master-0 kubenswrapper[7648]: I0308 03:11:55.144154 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-config\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.144617 master-0 kubenswrapper[7648]: E0308 03:11:55.144568 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:55.144617 master-0 
kubenswrapper[7648]: E0308 03:11:55.144614 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert podName:24d2f165-4dd2-4b16-86a0-eed3c161d716 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:55.644602419 +0000 UTC m=+8.255920709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert") pod "controller-manager-6bfbc9b76d-hns7n" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716") : secret "serving-cert" not found Mar 08 03:11:55.145139 master-0 kubenswrapper[7648]: I0308 03:11:55.145096 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-config\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.148194 master-0 kubenswrapper[7648]: I0308 03:11:55.148157 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-proxy-ca-bundles\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.178393 master-0 kubenswrapper[7648]: I0308 03:11:55.178162 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4b8\" (UniqueName: \"kubernetes.io/projected/24d2f165-4dd2-4b16-86a0-eed3c161d716-kube-api-access-9t4b8\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.256356 master-0 kubenswrapper[7648]: I0308 03:11:55.256260 7648 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"] Mar 08 03:11:55.257084 master-0 kubenswrapper[7648]: E0308 03:11:55.257029 7648 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" podUID="24d2f165-4dd2-4b16-86a0-eed3c161d716" Mar 08 03:11:55.447895 master-0 kubenswrapper[7648]: I0308 03:11:55.447735 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:55.447895 master-0 kubenswrapper[7648]: I0308 03:11:55.447794 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:55.448189 master-0 kubenswrapper[7648]: E0308 03:11:55.447925 7648 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:55.448189 master-0 kubenswrapper[7648]: E0308 03:11:55.447974 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:57.447957937 +0000 UTC m=+10.059276227 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : secret "serving-cert" not found Mar 08 03:11:55.448292 master-0 kubenswrapper[7648]: E0308 03:11:55.448273 7648 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:55.448366 master-0 kubenswrapper[7648]: E0308 03:11:55.448299 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:57.448292619 +0000 UTC m=+10.059610909 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : configmap "client-ca" not found Mar 08 03:11:55.619653 master-0 kubenswrapper[7648]: I0308 03:11:55.618329 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f52c72-c20a-4439-a48f-2030ffe30ff8" path="/var/lib/kubelet/pods/34f52c72-c20a-4439-a48f-2030ffe30ff8/volumes" Mar 08 03:11:55.652290 master-0 kubenswrapper[7648]: I0308 03:11:55.652242 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.652290 master-0 kubenswrapper[7648]: I0308 03:11:55.652280 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n" Mar 08 03:11:55.652548 master-0 kubenswrapper[7648]: E0308 03:11:55.652413 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:55.652548 master-0 kubenswrapper[7648]: E0308 03:11:55.652497 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca podName:24d2f165-4dd2-4b16-86a0-eed3c161d716 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.652468404 +0000 UTC m=+9.263786694 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca") pod "controller-manager-6bfbc9b76d-hns7n" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716") : configmap "client-ca" not found Mar 08 03:11:55.652668 master-0 kubenswrapper[7648]: E0308 03:11:55.652629 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:55.652753 master-0 kubenswrapper[7648]: E0308 03:11:55.652727 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert podName:24d2f165-4dd2-4b16-86a0-eed3c161d716 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:56.652702063 +0000 UTC m=+9.264020423 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert") pod "controller-manager-6bfbc9b76d-hns7n" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716") : secret "serving-cert" not found
Mar 08 03:11:55.771704 master-0 kubenswrapper[7648]: I0308 03:11:55.771578 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:55.809630 master-0 kubenswrapper[7648]: I0308 03:11:55.809598 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"
Mar 08 03:11:55.810916 master-0 kubenswrapper[7648]: I0308 03:11:55.810597 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" event={"ID":"c9f377bf-79c5-4425-b5d1-256961835f62","Type":"ContainerStarted","Data":"d4ea1844b53b95e64939abf18bf680af5d21c94a78af3eaf8fa2b814c48bf2f0"}
Mar 08 03:11:55.822507 master-0 kubenswrapper[7648]: I0308 03:11:55.822077 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"
Mar 08 03:11:55.827085 master-0 kubenswrapper[7648]: I0308 03:11:55.827032 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" podStartSLOduration=2.827016598 podStartE2EDuration="2.827016598s" podCreationTimestamp="2026-03-08 03:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:11:55.825876199 +0000 UTC m=+8.437194499" watchObservedRunningTime="2026-03-08 03:11:55.827016598 +0000 UTC m=+8.438334888"
Mar 08 03:11:55.957063 master-0 kubenswrapper[7648]: I0308 03:11:55.956988 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t4b8\" (UniqueName: \"kubernetes.io/projected/24d2f165-4dd2-4b16-86a0-eed3c161d716-kube-api-access-9t4b8\") pod \"24d2f165-4dd2-4b16-86a0-eed3c161d716\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") "
Mar 08 03:11:55.957827 master-0 kubenswrapper[7648]: I0308 03:11:55.957079 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-proxy-ca-bundles\") pod \"24d2f165-4dd2-4b16-86a0-eed3c161d716\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") "
Mar 08 03:11:55.957827 master-0 kubenswrapper[7648]: I0308 03:11:55.957120 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-config\") pod \"24d2f165-4dd2-4b16-86a0-eed3c161d716\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") "
Mar 08 03:11:55.958009 master-0 kubenswrapper[7648]: I0308 03:11:55.957956 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-config" (OuterVolumeSpecName: "config") pod "24d2f165-4dd2-4b16-86a0-eed3c161d716" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:11:55.958964 master-0 kubenswrapper[7648]: I0308 03:11:55.958924 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "24d2f165-4dd2-4b16-86a0-eed3c161d716" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:11:55.965501 master-0 kubenswrapper[7648]: I0308 03:11:55.962641 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d2f165-4dd2-4b16-86a0-eed3c161d716-kube-api-access-9t4b8" (OuterVolumeSpecName: "kube-api-access-9t4b8") pod "24d2f165-4dd2-4b16-86a0-eed3c161d716" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716"). InnerVolumeSpecName "kube-api-access-9t4b8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:11:56.061587 master-0 kubenswrapper[7648]: I0308 03:11:56.061122 7648 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:56.061587 master-0 kubenswrapper[7648]: I0308 03:11:56.061178 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:56.061587 master-0 kubenswrapper[7648]: I0308 03:11:56.061190 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t4b8\" (UniqueName: \"kubernetes.io/projected/24d2f165-4dd2-4b16-86a0-eed3c161d716-kube-api-access-9t4b8\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:56.357696 master-0 kubenswrapper[7648]: I0308 03:11:56.357644 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365188 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365244 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365274 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365292 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365323 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365341 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365360 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365376 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365401 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365419 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365439 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365458 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365430 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.365515 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.365633 7648 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.365678 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.365666666 +0000 UTC m=+16.976984956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.365886 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.365941 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.365924905 +0000 UTC m=+16.977243195 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366038 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366048 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366059 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.36605342 +0000 UTC m=+16.977371710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366075 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.36606694 +0000 UTC m=+16.977385230 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366109 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366128 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls podName:5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.366122562 +0000 UTC m=+16.977440852 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-gwv4q" (UID: "5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b") : secret "cluster-baremetal-operator-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366163 7648 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366179 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert podName:c4af87e2-50c3-4d08-9326-9c8876a6fd7b nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.366173374 +0000 UTC m=+16.977491664 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert") pod "cluster-version-operator-745944c6b7-f64sq" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b") : secret "cluster-version-operator-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366212 7648 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366245 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.366238136 +0000 UTC m=+16.977556416 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366281 7648 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366300 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls podName:e74c8bb2-e063-4b60-b3fe-651aa534d029 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.366295038 +0000 UTC m=+16.977613328 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-4vqgc" (UID: "e74c8bb2-e063-4b60-b3fe-651aa534d029") : secret "image-registry-operator-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366329 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366348 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.36634271 +0000 UTC m=+16.977661000 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366381 7648 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366397 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls podName:8f99f81a-fd2d-432e-a3bc-e451342650b1 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.366392161 +0000 UTC m=+16.977710451 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls") pod "dns-operator-589895fbb7-z45kw" (UID: "8f99f81a-fd2d-432e-a3bc-e451342650b1") : secret "metrics-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366652 7648 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: E0308 03:11:56.366677 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls podName:fd6b827c-70b0-47ed-b07c-c696343248a8 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.366671411 +0000 UTC m=+16.977989701 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls") pod "ingress-operator-677db989d6-r9m2k" (UID: "fd6b827c-70b0-47ed-b07c-c696343248a8") : secret "metrics-tls" not found
Mar 08 03:11:56.377554 master-0 kubenswrapper[7648]: I0308 03:11:56.369333 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:56.391207 master-0 kubenswrapper[7648]: I0308 03:11:56.391058 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:56.467657 master-0 kubenswrapper[7648]: I0308 03:11:56.466663 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:11:56.467657 master-0 kubenswrapper[7648]: I0308 03:11:56.466850 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:11:56.467657 master-0 kubenswrapper[7648]: E0308 03:11:56.466976 7648 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:11:56.467657 master-0 kubenswrapper[7648]: E0308 03:11:56.467021 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.467007343 +0000 UTC m=+17.078325633 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : secret "metrics-daemon-secret" not found
Mar 08 03:11:56.467657 master-0 kubenswrapper[7648]: E0308 03:11:56.467062 7648 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:11:56.467657 master-0 kubenswrapper[7648]: E0308 03:11:56.467078 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.467073076 +0000 UTC m=+17.078391366 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found
Mar 08 03:11:56.668854 master-0 kubenswrapper[7648]: I0308 03:11:56.668795 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"
Mar 08 03:11:56.668979 master-0 kubenswrapper[7648]: I0308 03:11:56.668886 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca\") pod \"controller-manager-6bfbc9b76d-hns7n\" (UID: \"24d2f165-4dd2-4b16-86a0-eed3c161d716\") " pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"
Mar 08 03:11:56.669058 master-0 kubenswrapper[7648]: E0308 03:11:56.669009 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:11:56.669124 master-0 kubenswrapper[7648]: E0308 03:11:56.669107 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert podName:24d2f165-4dd2-4b16-86a0-eed3c161d716 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:58.669088137 +0000 UTC m=+11.280406427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert") pod "controller-manager-6bfbc9b76d-hns7n" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716") : secret "serving-cert" not found
Mar 08 03:11:56.669520 master-0 kubenswrapper[7648]: E0308 03:11:56.669194 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:11:56.669520 master-0 kubenswrapper[7648]: E0308 03:11:56.669360 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca podName:24d2f165-4dd2-4b16-86a0-eed3c161d716 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:58.669351096 +0000 UTC m=+11.280669376 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca") pod "controller-manager-6bfbc9b76d-hns7n" (UID: "24d2f165-4dd2-4b16-86a0-eed3c161d716") : configmap "client-ca" not found
Mar 08 03:11:56.671436 master-0 kubenswrapper[7648]: I0308 03:11:56.671401 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:11:56.817176 master-0 kubenswrapper[7648]: I0308 03:11:56.816674 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42" event={"ID":"306b824f-dcfb-4e69-9a23-64dfbae61852","Type":"ContainerStarted","Data":"463f11a2d2c94a41d1280db226e711c6ac95d0fa9aa2ee00ad0317ae4b62bc14"}
Mar 08 03:11:56.817176 master-0 kubenswrapper[7648]: I0308 03:11:56.816741 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"
Mar 08 03:11:56.822944 master-0 kubenswrapper[7648]: I0308 03:11:56.822707 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:11:56.883339 master-0 kubenswrapper[7648]: I0308 03:11:56.883298 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55446fdfd6-sn988"]
Mar 08 03:11:56.884929 master-0 kubenswrapper[7648]: I0308 03:11:56.884028 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:56.887530 master-0 kubenswrapper[7648]: I0308 03:11:56.887433 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"]
Mar 08 03:11:56.888268 master-0 kubenswrapper[7648]: I0308 03:11:56.888195 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:11:56.890170 master-0 kubenswrapper[7648]: I0308 03:11:56.890012 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:11:56.891369 master-0 kubenswrapper[7648]: I0308 03:11:56.891171 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:11:56.892506 master-0 kubenswrapper[7648]: I0308 03:11:56.891499 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 03:11:56.895616 master-0 kubenswrapper[7648]: I0308 03:11:56.893388 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:11:56.897709 master-0 kubenswrapper[7648]: I0308 03:11:56.897663 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bfbc9b76d-hns7n"]
Mar 08 03:11:56.898192 master-0 kubenswrapper[7648]: I0308 03:11:56.898151 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:11:56.901747 master-0 kubenswrapper[7648]: I0308 03:11:56.901649 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55446fdfd6-sn988"]
Mar 08 03:11:56.930205 master-0 kubenswrapper[7648]: I0308 03:11:56.930157 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"]
Mar 08 03:11:56.941878 master-0 kubenswrapper[7648]: W0308 03:11:56.941803 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6d3abf_d2df_4a6c_b7ab_40b78948ad0c.slice/crio-ab315e2bf3bc162089d02708f674e1774b5dd32ef86eb2274f79f03cc0e87f08 WatchSource:0}: Error finding container ab315e2bf3bc162089d02708f674e1774b5dd32ef86eb2274f79f03cc0e87f08: Status 404 returned error can't find the container with id ab315e2bf3bc162089d02708f674e1774b5dd32ef86eb2274f79f03cc0e87f08
Mar 08 03:11:56.973504 master-0 kubenswrapper[7648]: I0308 03:11:56.973436 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-config\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:56.974395 master-0 kubenswrapper[7648]: I0308 03:11:56.974373 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7hxt\" (UniqueName: \"kubernetes.io/projected/b0cd3a37-76c7-4158-997d-0b8b2fd81131-kube-api-access-k7hxt\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:56.974599 master-0 kubenswrapper[7648]: I0308 03:11:56.974585 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:56.974717 master-0 kubenswrapper[7648]: I0308 03:11:56.974705 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-proxy-ca-bundles\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:56.974978 master-0 kubenswrapper[7648]: I0308 03:11:56.974942 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:56.975111 master-0 kubenswrapper[7648]: I0308 03:11:56.975090 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24d2f165-4dd2-4b16-86a0-eed3c161d716-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:56.975111 master-0 kubenswrapper[7648]: I0308 03:11:56.975110 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24d2f165-4dd2-4b16-86a0-eed3c161d716-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:11:57.075822 master-0 kubenswrapper[7648]: I0308 03:11:57.075722 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-proxy-ca-bundles\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.075822 master-0 kubenswrapper[7648]: I0308 03:11:57.075806 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.076103 master-0 kubenswrapper[7648]: E0308 03:11:57.075887 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:11:57.076103 master-0 kubenswrapper[7648]: E0308 03:11:57.075936 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca podName:b0cd3a37-76c7-4158-997d-0b8b2fd81131 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:57.575919406 +0000 UTC m=+10.187237686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca") pod "controller-manager-55446fdfd6-sn988" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131") : configmap "client-ca" not found
Mar 08 03:11:57.076195 master-0 kubenswrapper[7648]: I0308 03:11:57.076174 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-config\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.076251 master-0 kubenswrapper[7648]: I0308 03:11:57.076199 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7hxt\" (UniqueName: \"kubernetes.io/projected/b0cd3a37-76c7-4158-997d-0b8b2fd81131-kube-api-access-k7hxt\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.076308 master-0 kubenswrapper[7648]: I0308 03:11:57.076282 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.076374 master-0 kubenswrapper[7648]: E0308 03:11:57.076356 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 08 03:11:57.076412 master-0 kubenswrapper[7648]: E0308 03:11:57.076391 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert podName:b0cd3a37-76c7-4158-997d-0b8b2fd81131 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:57.576381992 +0000 UTC m=+10.187700282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert") pod "controller-manager-55446fdfd6-sn988" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131") : secret "serving-cert" not found
Mar 08 03:11:57.077430 master-0 kubenswrapper[7648]: I0308 03:11:57.077403 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-config\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.077656 master-0 kubenswrapper[7648]: I0308 03:11:57.077619 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-proxy-ca-bundles\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.101216 master-0 kubenswrapper[7648]: I0308 03:11:57.100883 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7hxt\" (UniqueName: \"kubernetes.io/projected/b0cd3a37-76c7-4158-997d-0b8b2fd81131-kube-api-access-k7hxt\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:11:57.404932 master-0 kubenswrapper[7648]: I0308 03:11:57.404857 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:11:57.405130 master-0 kubenswrapper[7648]: I0308 03:11:57.405118 7648 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:11:57.405172 master-0 kubenswrapper[7648]: I0308 03:11:57.405138 7648 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:11:57.447640 master-0 kubenswrapper[7648]: I0308 03:11:57.447575 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:11:57.457677 master-0 kubenswrapper[7648]: I0308 03:11:57.455615 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:11:57.483334 master-0 kubenswrapper[7648]: I0308 03:11:57.482892 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:57.483334 master-0 kubenswrapper[7648]: I0308 03:11:57.483001 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:11:57.484684 master-0 kubenswrapper[7648]: E0308 03:11:57.484631 7648 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:57.484805 master-0 kubenswrapper[7648]: E0308 03:11:57.484707 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. 
No retries permitted until 2026-03-08 03:12:01.484683282 +0000 UTC m=+14.096001582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : secret "serving-cert" not found Mar 08 03:11:57.485461 master-0 kubenswrapper[7648]: E0308 03:11:57.485375 7648 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:57.485679 master-0 kubenswrapper[7648]: E0308 03:11:57.485536 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:01.485466989 +0000 UTC m=+14.096785329 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : configmap "client-ca" not found Mar 08 03:11:57.586128 master-0 kubenswrapper[7648]: I0308 03:11:57.584713 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:11:57.586128 master-0 kubenswrapper[7648]: E0308 03:11:57.585046 7648 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:11:57.586128 master-0 kubenswrapper[7648]: E0308 03:11:57.585161 7648 configmap.go:193] Couldn't get configMap 
openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:57.586128 master-0 kubenswrapper[7648]: E0308 03:11:57.585183 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert podName:b0cd3a37-76c7-4158-997d-0b8b2fd81131 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:58.585149429 +0000 UTC m=+11.196467759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert") pod "controller-manager-55446fdfd6-sn988" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131") : secret "serving-cert" not found Mar 08 03:11:57.586128 master-0 kubenswrapper[7648]: I0308 03:11:57.585075 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:11:57.586128 master-0 kubenswrapper[7648]: E0308 03:11:57.585238 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca podName:b0cd3a37-76c7-4158-997d-0b8b2fd81131 nodeName:}" failed. No retries permitted until 2026-03-08 03:11:58.585212011 +0000 UTC m=+11.196530331 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca") pod "controller-manager-55446fdfd6-sn988" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131") : configmap "client-ca" not found Mar 08 03:11:57.627604 master-0 kubenswrapper[7648]: I0308 03:11:57.627467 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d2f165-4dd2-4b16-86a0-eed3c161d716" path="/var/lib/kubelet/pods/24d2f165-4dd2-4b16-86a0-eed3c161d716/volumes" Mar 08 03:11:57.826750 master-0 kubenswrapper[7648]: I0308 03:11:57.823609 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" event={"ID":"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c","Type":"ContainerStarted","Data":"ab315e2bf3bc162089d02708f674e1774b5dd32ef86eb2274f79f03cc0e87f08"} Mar 08 03:11:57.829584 master-0 kubenswrapper[7648]: I0308 03:11:57.829499 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42" event={"ID":"306b824f-dcfb-4e69-9a23-64dfbae61852","Type":"ContainerStarted","Data":"dd69283a923aae4672f43bfb1b115592f6406259a8e1bd6cd73479fad2bea593"} Mar 08 03:11:57.829707 master-0 kubenswrapper[7648]: I0308 03:11:57.829615 7648 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:11:57.844275 master-0 kubenswrapper[7648]: I0308 03:11:57.844184 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42" podStartSLOduration=3.844542412 podStartE2EDuration="5.844163837s" podCreationTimestamp="2026-03-08 03:11:52 +0000 UTC" firstStartedPulling="2026-03-08 03:11:54.594191994 +0000 UTC m=+7.205510284" lastFinishedPulling="2026-03-08 03:11:56.593813419 +0000 UTC m=+9.205131709" observedRunningTime="2026-03-08 03:11:57.843359569 +0000 UTC m=+10.454677859" 
watchObservedRunningTime="2026-03-08 03:11:57.844163837 +0000 UTC m=+10.455482137" Mar 08 03:11:58.599249 master-0 kubenswrapper[7648]: I0308 03:11:58.599179 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:11:58.600451 master-0 kubenswrapper[7648]: I0308 03:11:58.599279 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:11:58.600451 master-0 kubenswrapper[7648]: E0308 03:11:58.599685 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:11:58.600451 master-0 kubenswrapper[7648]: E0308 03:11:58.599742 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca podName:b0cd3a37-76c7-4158-997d-0b8b2fd81131 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:00.59972581 +0000 UTC m=+13.211044100 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca") pod "controller-manager-55446fdfd6-sn988" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131") : configmap "client-ca" not found Mar 08 03:11:58.606375 master-0 kubenswrapper[7648]: I0308 03:11:58.606301 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:12:00.635733 master-0 kubenswrapper[7648]: I0308 03:12:00.635434 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:12:00.636824 master-0 kubenswrapper[7648]: E0308 03:12:00.635741 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:12:00.636824 master-0 kubenswrapper[7648]: E0308 03:12:00.635892 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca podName:b0cd3a37-76c7-4158-997d-0b8b2fd81131 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.635857315 +0000 UTC m=+17.247175655 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca") pod "controller-manager-55446fdfd6-sn988" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131") : configmap "client-ca" not found Mar 08 03:12:01.559025 master-0 kubenswrapper[7648]: I0308 03:12:01.558981 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:12:01.559025 master-0 kubenswrapper[7648]: I0308 03:12:01.559028 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:12:01.559281 master-0 kubenswrapper[7648]: E0308 03:12:01.559227 7648 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 08 03:12:01.559281 master-0 kubenswrapper[7648]: E0308 03:12:01.559272 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:09.55925865 +0000 UTC m=+22.170576940 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : secret "serving-cert" not found Mar 08 03:12:01.559648 master-0 kubenswrapper[7648]: E0308 03:12:01.559606 7648 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:12:01.559711 master-0 kubenswrapper[7648]: E0308 03:12:01.559672 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:09.559658523 +0000 UTC m=+22.170976813 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : configmap "client-ca" not found Mar 08 03:12:02.580764 master-0 kubenswrapper[7648]: I0308 03:12:02.580719 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-8649944765-6thnh"] Mar 08 03:12:02.581601 master-0 kubenswrapper[7648]: I0308 03:12:02.581518 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.587280 master-0 kubenswrapper[7648]: I0308 03:12:02.587229 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 03:12:02.587280 master-0 kubenswrapper[7648]: I0308 03:12:02.587266 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 03:12:02.587465 master-0 kubenswrapper[7648]: I0308 03:12:02.587386 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Mar 08 03:12:02.587465 master-0 kubenswrapper[7648]: I0308 03:12:02.587449 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 03:12:02.588974 master-0 kubenswrapper[7648]: I0308 03:12:02.588937 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 03:12:02.589189 master-0 kubenswrapper[7648]: I0308 03:12:02.589160 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 03:12:02.589527 master-0 kubenswrapper[7648]: I0308 03:12:02.589466 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:12:02.589840 master-0 kubenswrapper[7648]: I0308 03:12:02.589811 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 03:12:02.595960 master-0 kubenswrapper[7648]: I0308 03:12:02.595925 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8649944765-6thnh"] Mar 08 03:12:02.596752 master-0 kubenswrapper[7648]: I0308 03:12:02.596725 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 03:12:02.602734 master-0 kubenswrapper[7648]: I0308 03:12:02.602691 7648 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Mar 08 03:12:02.669880 master-0 kubenswrapper[7648]: I0308 03:12:02.669566 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-encryption-config\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670080 master-0 kubenswrapper[7648]: I0308 03:12:02.669915 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670080 master-0 kubenswrapper[7648]: I0308 03:12:02.669938 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-trusted-ca-bundle\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670080 master-0 kubenswrapper[7648]: I0308 03:12:02.669954 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670080 master-0 kubenswrapper[7648]: I0308 03:12:02.669990 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsh4\" (UniqueName: 
\"kubernetes.io/projected/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-kube-api-access-glsh4\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670080 master-0 kubenswrapper[7648]: I0308 03:12:02.670019 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-serving-ca\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670282 master-0 kubenswrapper[7648]: I0308 03:12:02.670225 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-node-pullsecrets\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670435 master-0 kubenswrapper[7648]: I0308 03:12:02.670405 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-config\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670472 master-0 kubenswrapper[7648]: I0308 03:12:02.670440 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-client\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670523 master-0 kubenswrapper[7648]: I0308 03:12:02.670470 7648 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit-dir\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.670557 master-0 kubenswrapper[7648]: I0308 03:12:02.670530 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-image-import-ca\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.771245 master-0 kubenswrapper[7648]: I0308 03:12:02.771123 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.771245 master-0 kubenswrapper[7648]: I0308 03:12:02.771172 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-trusted-ca-bundle\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.771245 master-0 kubenswrapper[7648]: I0308 03:12:02.771205 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.771245 master-0 
kubenswrapper[7648]: I0308 03:12:02.771221 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glsh4\" (UniqueName: \"kubernetes.io/projected/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-kube-api-access-glsh4\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.771569 master-0 kubenswrapper[7648]: E0308 03:12:02.771300 7648 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 03:12:02.771569 master-0 kubenswrapper[7648]: E0308 03:12:02.771369 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:03.271349598 +0000 UTC m=+15.882667898 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : configmap "audit-0" not found Mar 08 03:12:02.771693 master-0 kubenswrapper[7648]: I0308 03:12:02.771653 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-serving-ca\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.771756 master-0 kubenswrapper[7648]: I0308 03:12:02.771734 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-node-pullsecrets\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " 
pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.771925 master-0 kubenswrapper[7648]: E0308 03:12:02.771676 7648 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 03:12:02.771974 master-0 kubenswrapper[7648]: I0308 03:12:02.771817 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-node-pullsecrets\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.772010 master-0 kubenswrapper[7648]: I0308 03:12:02.771984 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-config\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.772638 master-0 kubenswrapper[7648]: E0308 03:12:02.772057 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:03.27200229 +0000 UTC m=+15.883320580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : secret "serving-cert" not found Mar 08 03:12:02.772705 master-0 kubenswrapper[7648]: I0308 03:12:02.772514 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-serving-ca\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.772705 master-0 kubenswrapper[7648]: I0308 03:12:02.772567 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-config\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.772705 master-0 kubenswrapper[7648]: I0308 03:12:02.772639 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-client\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.772820 master-0 kubenswrapper[7648]: I0308 03:12:02.772726 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-image-import-ca\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.772987 master-0 kubenswrapper[7648]: I0308 03:12:02.772938 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit-dir\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.773114 master-0 kubenswrapper[7648]: I0308 03:12:02.773050 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-trusted-ca-bundle\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.773208 master-0 kubenswrapper[7648]: I0308 03:12:02.773166 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-image-import-ca\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.773257 master-0 kubenswrapper[7648]: I0308 03:12:02.772745 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit-dir\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.773356 master-0 kubenswrapper[7648]: I0308 03:12:02.773333 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-encryption-config\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.778765 master-0 kubenswrapper[7648]: I0308 03:12:02.778729 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-encryption-config\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.779920 master-0 kubenswrapper[7648]: I0308 03:12:02.779883 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-client\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.792593 master-0 kubenswrapper[7648]: I0308 03:12:02.792546 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsh4\" (UniqueName: \"kubernetes.io/projected/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-kube-api-access-glsh4\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:02.849955 master-0 kubenswrapper[7648]: I0308 03:12:02.849854 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" event={"ID":"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c","Type":"ContainerStarted","Data":"8a3da09cabdcb126428fcd447defcc99973fd5db3565d3792f66591da1ac8333"} Mar 08 03:12:02.891267 master-0 kubenswrapper[7648]: I0308 03:12:02.891208 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-ntxqg"] Mar 08 03:12:02.891871 master-0 kubenswrapper[7648]: I0308 03:12:02.891851 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.975648 master-0 kubenswrapper[7648]: I0308 03:12:02.975611 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-tuned\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.975860 master-0 kubenswrapper[7648]: I0308 03:12:02.975683 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-kubernetes\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.975860 master-0 kubenswrapper[7648]: I0308 03:12:02.975710 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-host\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.975953 master-0 kubenswrapper[7648]: I0308 03:12:02.975898 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-modprobe-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.975953 master-0 kubenswrapper[7648]: I0308 03:12:02.975934 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-run\") pod \"tuned-ntxqg\" 
(UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976032 master-0 kubenswrapper[7648]: I0308 03:12:02.975993 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976169 master-0 kubenswrapper[7648]: I0308 03:12:02.976119 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-sys\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976227 master-0 kubenswrapper[7648]: I0308 03:12:02.976187 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-var-lib-kubelet\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976331 master-0 kubenswrapper[7648]: I0308 03:12:02.976310 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysconfig\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976393 master-0 kubenswrapper[7648]: I0308 03:12:02.976362 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-systemd\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976437 master-0 kubenswrapper[7648]: I0308 03:12:02.976404 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-conf\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976437 master-0 kubenswrapper[7648]: I0308 03:12:02.976429 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh2rs\" (UniqueName: \"kubernetes.io/projected/de90d207-06d6-4778-b1b0-9020b1f2a881-kube-api-access-lh2rs\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976549 master-0 kubenswrapper[7648]: I0308 03:12:02.976507 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-lib-modules\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:02.976549 master-0 kubenswrapper[7648]: I0308 03:12:02.976539 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-tmp\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.079978 master-0 kubenswrapper[7648]: I0308 03:12:03.079921 7648 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-modprobe-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080066 master-0 kubenswrapper[7648]: I0308 03:12:03.080007 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080066 master-0 kubenswrapper[7648]: I0308 03:12:03.080041 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-run\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080135 master-0 kubenswrapper[7648]: I0308 03:12:03.080118 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-sys\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080223 master-0 kubenswrapper[7648]: I0308 03:12:03.080188 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-var-lib-kubelet\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080262 master-0 kubenswrapper[7648]: I0308 03:12:03.080240 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysconfig\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080333 master-0 kubenswrapper[7648]: I0308 03:12:03.080303 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-systemd\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080398 master-0 kubenswrapper[7648]: I0308 03:12:03.080376 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-conf\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080431 master-0 kubenswrapper[7648]: I0308 03:12:03.080413 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh2rs\" (UniqueName: \"kubernetes.io/projected/de90d207-06d6-4778-b1b0-9020b1f2a881-kube-api-access-lh2rs\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080536 master-0 kubenswrapper[7648]: I0308 03:12:03.080478 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-lib-modules\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080587 master-0 kubenswrapper[7648]: I0308 03:12:03.080546 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-tmp\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080648 master-0 kubenswrapper[7648]: I0308 03:12:03.080625 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-tuned\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080727 master-0 kubenswrapper[7648]: I0308 03:12:03.080702 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-kubernetes\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080773 master-0 kubenswrapper[7648]: I0308 03:12:03.080740 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-host\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.080924 master-0 kubenswrapper[7648]: I0308 03:12:03.080886 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-host\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081069 master-0 kubenswrapper[7648]: I0308 03:12:03.081041 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-modprobe-d\") pod 
\"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081155 master-0 kubenswrapper[7648]: I0308 03:12:03.081130 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081300 master-0 kubenswrapper[7648]: I0308 03:12:03.081273 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-run\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081350 master-0 kubenswrapper[7648]: I0308 03:12:03.081337 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-sys\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081454 master-0 kubenswrapper[7648]: I0308 03:12:03.081392 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-var-lib-kubelet\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081548 master-0 kubenswrapper[7648]: I0308 03:12:03.081506 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysconfig\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081617 master-0 kubenswrapper[7648]: I0308 03:12:03.081592 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-systemd\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.081785 master-0 kubenswrapper[7648]: I0308 03:12:03.081750 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-conf\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.082255 master-0 kubenswrapper[7648]: I0308 03:12:03.082201 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-lib-modules\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.082361 master-0 kubenswrapper[7648]: I0308 03:12:03.082332 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-kubernetes\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.087314 master-0 kubenswrapper[7648]: I0308 03:12:03.087274 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-tuned\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.099922 
master-0 kubenswrapper[7648]: I0308 03:12:03.099831 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-tmp\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.107907 master-0 kubenswrapper[7648]: I0308 03:12:03.107858 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh2rs\" (UniqueName: \"kubernetes.io/projected/de90d207-06d6-4778-b1b0-9020b1f2a881-kube-api-access-lh2rs\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.231504 master-0 kubenswrapper[7648]: I0308 03:12:03.231432 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:12:03.244560 master-0 kubenswrapper[7648]: W0308 03:12:03.244463 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde90d207_06d6_4778_b1b0_9020b1f2a881.slice/crio-d68987d833cb19d526c9c65ebce6cd5035eac3d46a347e4bbd5b1d23c087987c WatchSource:0}: Error finding container d68987d833cb19d526c9c65ebce6cd5035eac3d46a347e4bbd5b1d23c087987c: Status 404 returned error can't find the container with id d68987d833cb19d526c9c65ebce6cd5035eac3d46a347e4bbd5b1d23c087987c Mar 08 03:12:03.283465 master-0 kubenswrapper[7648]: I0308 03:12:03.283429 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:03.283603 master-0 kubenswrapper[7648]: I0308 03:12:03.283473 7648 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:03.283648 master-0 kubenswrapper[7648]: E0308 03:12:03.283593 7648 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 03:12:03.283648 master-0 kubenswrapper[7648]: E0308 03:12:03.283628 7648 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 03:12:03.283730 master-0 kubenswrapper[7648]: E0308 03:12:03.283658 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.283642496 +0000 UTC m=+16.894960786 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : configmap "audit-0" not found Mar 08 03:12:03.283730 master-0 kubenswrapper[7648]: E0308 03:12:03.283681 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:04.283667417 +0000 UTC m=+16.894985707 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : secret "serving-cert" not found Mar 08 03:12:03.855843 master-0 kubenswrapper[7648]: I0308 03:12:03.855478 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerStarted","Data":"873a972e72df02e333cdd5be8d4415642ae4a31a8ef844e8221962cd437b0309"} Mar 08 03:12:03.856649 master-0 kubenswrapper[7648]: I0308 03:12:03.856235 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:12:03.858079 master-0 kubenswrapper[7648]: I0308 03:12:03.857993 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" event={"ID":"de90d207-06d6-4778-b1b0-9020b1f2a881","Type":"ContainerStarted","Data":"cf6ec2e3d39c266699ed8ca28f0ed2f947e41eb27a2b4787d91f4c747c1f7fb7"} Mar 08 03:12:03.858140 master-0 kubenswrapper[7648]: I0308 03:12:03.858100 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" event={"ID":"de90d207-06d6-4778-b1b0-9020b1f2a881","Type":"ContainerStarted","Data":"d68987d833cb19d526c9c65ebce6cd5035eac3d46a347e4bbd5b1d23c087987c"} Mar 08 03:12:03.906117 master-0 kubenswrapper[7648]: I0308 03:12:03.905996 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" podStartSLOduration=1.905970032 podStartE2EDuration="1.905970032s" podCreationTimestamp="2026-03-08 03:12:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:03.903970513 +0000 
UTC m=+16.515288843" watchObservedRunningTime="2026-03-08 03:12:03.905970032 +0000 UTC m=+16.517288362" Mar 08 03:12:04.298239 master-0 kubenswrapper[7648]: I0308 03:12:04.298067 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:04.298239 master-0 kubenswrapper[7648]: I0308 03:12:04.298145 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:04.298688 master-0 kubenswrapper[7648]: E0308 03:12:04.298277 7648 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 08 03:12:04.298688 master-0 kubenswrapper[7648]: E0308 03:12:04.298648 7648 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 08 03:12:04.298818 master-0 kubenswrapper[7648]: E0308 03:12:04.298748 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:06.298711065 +0000 UTC m=+18.910029385 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : configmap "audit-0" not found Mar 08 03:12:04.298818 master-0 kubenswrapper[7648]: E0308 03:12:04.298799 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:06.298782508 +0000 UTC m=+18.910100828 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : secret "serving-cert" not found Mar 08 03:12:04.400677 master-0 kubenswrapper[7648]: I0308 03:12:04.399891 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:12:04.400903 master-0 kubenswrapper[7648]: I0308 03:12:04.400791 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:12:04.400903 master-0 kubenswrapper[7648]: I0308 03:12:04.400871 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:12:04.401178 master-0 kubenswrapper[7648]: E0308 03:12:04.401132 7648 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 08 03:12:04.401347 master-0 kubenswrapper[7648]: I0308 03:12:04.401280 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:12:04.401559 master-0 kubenswrapper[7648]: I0308 03:12:04.401476 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:12:04.401559 master-0 kubenswrapper[7648]: I0308 03:12:04.401536 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:12:04.401788 master-0 kubenswrapper[7648]: E0308 03:12:04.401678 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls 
podName:4108f513-acef-473a-ab03-f3761b2bd0d8 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:20.401636577 +0000 UTC m=+33.012954907 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-gj775" (UID: "4108f513-acef-473a-ab03-f3761b2bd0d8") : secret "cluster-monitoring-operator-tls" not found
Mar 08 03:12:04.401788 master-0 kubenswrapper[7648]: I0308 03:12:04.401741 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:12:04.401953 master-0 kubenswrapper[7648]: I0308 03:12:04.401823 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:12:04.402021 master-0 kubenswrapper[7648]: E0308 03:12:04.401949 7648 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 08 03:12:04.402021 master-0 kubenswrapper[7648]: E0308 03:12:04.401961 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 08 03:12:04.402021 master-0 kubenswrapper[7648]: E0308 03:12:04.402003 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics podName:eedc7538-9cc6-4bf5-9628-e278310d796b nodeName:}" failed. No retries permitted until 2026-03-08 03:12:20.40198658 +0000 UTC m=+33.013304910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-7hsbf" (UID: "eedc7538-9cc6-4bf5-9628-e278310d796b") : secret "marketplace-operator-metrics" not found
Mar 08 03:12:04.402234 master-0 kubenswrapper[7648]: E0308 03:12:04.402029 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert podName:4f822854-b9ac-46f2-b03b-e7215fba9208 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:20.40201443 +0000 UTC m=+33.013332750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert") pod "olm-operator-d64cfc9db-pdgmg" (UID: "4f822854-b9ac-46f2-b03b-e7215fba9208") : secret "olm-operator-serving-cert" not found
Mar 08 03:12:04.402234 master-0 kubenswrapper[7648]: I0308 03:12:04.402081 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:12:04.402234 master-0 kubenswrapper[7648]: I0308 03:12:04.402139 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:12:04.402234 master-0 kubenswrapper[7648]: I0308 03:12:04.402194 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:12:04.402781 master-0 kubenswrapper[7648]: E0308 03:12:04.402740 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 08 03:12:04.402970 master-0 kubenswrapper[7648]: E0308 03:12:04.402905 7648 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 08 03:12:04.403123 master-0 kubenswrapper[7648]: E0308 03:12:04.403098 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert podName:2bbe9b81-0efb-4caa-bacd-55348cd392c6 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:20.402942853 +0000 UTC m=+33.014261173 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-2gxdj" (UID: "2bbe9b81-0efb-4caa-bacd-55348cd392c6") : secret "package-server-manager-serving-cert" not found
Mar 08 03:12:04.403272 master-0 kubenswrapper[7648]: E0308 03:12:04.403251 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert podName:d83aa242-606f-4adc-b689-4aa89625b533 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:20.403230562 +0000 UTC m=+33.014548892 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert") pod "catalog-operator-7d9c49f57b-vsnbw" (UID: "d83aa242-606f-4adc-b689-4aa89625b533") : secret "catalog-operator-serving-cert" not found
Mar 08 03:12:04.408707 master-0 kubenswrapper[7648]: I0308 03:12:04.408656 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:12:04.408836 master-0 kubenswrapper[7648]: I0308 03:12:04.408759 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:12:04.408982 master-0 kubenswrapper[7648]: I0308 03:12:04.408929 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:12:04.410660 master-0 kubenswrapper[7648]: I0308 03:12:04.410591 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:12:04.411316 master-0 kubenswrapper[7648]: I0308 03:12:04.411260 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:12:04.411652 master-0 kubenswrapper[7648]: I0308 03:12:04.411602 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"cluster-version-operator-745944c6b7-f64sq\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:12:04.475640 master-0 kubenswrapper[7648]: I0308 03:12:04.475533 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:12:04.475876 master-0 kubenswrapper[7648]: I0308 03:12:04.475643 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"
Mar 08 03:12:04.475876 master-0 kubenswrapper[7648]: I0308 03:12:04.475689 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:12:04.476343 master-0 kubenswrapper[7648]: I0308 03:12:04.476292 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:12:04.481880 master-0 kubenswrapper[7648]: I0308 03:12:04.481830 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:12:04.504228 master-0 kubenswrapper[7648]: I0308 03:12:04.504155 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:12:04.504577 master-0 kubenswrapper[7648]: I0308 03:12:04.504403 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:12:04.504707 master-0 kubenswrapper[7648]: E0308 03:12:04.504632 7648 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 08 03:12:04.504841 master-0 kubenswrapper[7648]: E0308 03:12:04.504738 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs podName:7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c nodeName:}" failed. No retries permitted until 2026-03-08 03:12:20.504701174 +0000 UTC m=+33.116019504 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs") pod "network-metrics-daemon-jl9tj" (UID: "7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c") : secret "metrics-daemon-secret" not found
Mar 08 03:12:04.504963 master-0 kubenswrapper[7648]: E0308 03:12:04.504842 7648 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 08 03:12:04.504963 master-0 kubenswrapper[7648]: E0308 03:12:04.504888 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs podName:23b66415-df37-4015-9a0c-69115b3a0739 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:20.50487406 +0000 UTC m=+33.116192390 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs") pod "multus-admission-controller-8d675b596-772zs" (UID: "23b66415-df37-4015-9a0c-69115b3a0739") : secret "multus-admission-controller-secret" not found
Mar 08 03:12:04.720388 master-0 kubenswrapper[7648]: I0308 03:12:04.717792 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988"
Mar 08 03:12:04.720388 master-0 kubenswrapper[7648]: E0308 03:12:04.717989 7648 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 08 03:12:04.720388 master-0 kubenswrapper[7648]: E0308 03:12:04.718393 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca podName:b0cd3a37-76c7-4158-997d-0b8b2fd81131 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:12.718371578 +0000 UTC m=+25.329689888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca") pod "controller-manager-55446fdfd6-sn988" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131") : configmap "client-ca" not found
Mar 08 03:12:04.862714 master-0 kubenswrapper[7648]: I0308 03:12:04.862638 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" event={"ID":"c4af87e2-50c3-4d08-9326-9c8876a6fd7b","Type":"ContainerStarted","Data":"60f35b3e196e060290be05c8adcbfb7dc922d8c1abdf61e0112a61ef38f0180d"}
Mar 08 03:12:05.381344 master-0 kubenswrapper[7648]: I0308 03:12:05.381279 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:12:05.381763 master-0 kubenswrapper[7648]: I0308 03:12:05.381727 7648 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 03:12:05.414163 master-0 kubenswrapper[7648]: I0308 03:12:05.413865 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-z45kw"]
Mar 08 03:12:05.417778 master-0 kubenswrapper[7648]: I0308 03:12:05.417725 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"]
Mar 08 03:12:05.424934 master-0 kubenswrapper[7648]: I0308 03:12:05.422853 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"]
Mar 08 03:12:05.430590 master-0 kubenswrapper[7648]: I0308 03:12:05.430070 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"]
Mar 08 03:12:05.439282 master-0 kubenswrapper[7648]: W0308 03:12:05.436844 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f8cefbb_0ac8_4d0d_a923_7a863bd4d35b.slice/crio-5888a4a5d71348b379d6e9015d48df3d9c05837487495c483686efe0c2418c25 WatchSource:0}: Error finding container 5888a4a5d71348b379d6e9015d48df3d9c05837487495c483686efe0c2418c25: Status 404 returned error can't find the container with id 5888a4a5d71348b379d6e9015d48df3d9c05837487495c483686efe0c2418c25
Mar 08 03:12:05.443409 master-0 kubenswrapper[7648]: W0308 03:12:05.443149 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode74c8bb2_e063_4b60_b3fe_651aa534d029.slice/crio-887566f7bfcbbe929a3906442fbf37654b94c1b82760831978893cfbf803fe8a WatchSource:0}: Error finding container 887566f7bfcbbe929a3906442fbf37654b94c1b82760831978893cfbf803fe8a: Status 404 returned error can't find the container with id 887566f7bfcbbe929a3906442fbf37654b94c1b82760831978893cfbf803fe8a
Mar 08 03:12:05.457222 master-0 kubenswrapper[7648]: I0308 03:12:05.457168 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:12:05.868558 master-0 kubenswrapper[7648]: I0308 03:12:05.868471 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" event={"ID":"fd6b827c-70b0-47ed-b07c-c696343248a8","Type":"ContainerStarted","Data":"5a6e3c3e5ef7bc2875a4438317596870f92270cd7e8853931713a647f7c41386"}
Mar 08 03:12:05.869978 master-0 kubenswrapper[7648]: I0308 03:12:05.869924 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" event={"ID":"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b","Type":"ContainerStarted","Data":"5888a4a5d71348b379d6e9015d48df3d9c05837487495c483686efe0c2418c25"}
Mar 08 03:12:05.871073 master-0 kubenswrapper[7648]: I0308 03:12:05.871025 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" event={"ID":"e74c8bb2-e063-4b60-b3fe-651aa534d029","Type":"ContainerStarted","Data":"887566f7bfcbbe929a3906442fbf37654b94c1b82760831978893cfbf803fe8a"}
Mar 08 03:12:05.874611 master-0 kubenswrapper[7648]: I0308 03:12:05.873383 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" event={"ID":"8f99f81a-fd2d-432e-a3bc-e451342650b1","Type":"ContainerStarted","Data":"5e30ced5c1465fa4b9f72a89783db5d665983a50641b79b25a38f7c94e44add4"}
Mar 08 03:12:06.358456 master-0 kubenswrapper[7648]: I0308 03:12:06.358421 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh"
Mar 08 03:12:06.358653 master-0 kubenswrapper[7648]: I0308 03:12:06.358468 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert\") pod \"apiserver-8649944765-6thnh\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " pod="openshift-apiserver/apiserver-8649944765-6thnh"
Mar 08 03:12:06.358653 master-0 kubenswrapper[7648]: E0308 03:12:06.358603 7648 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 08 03:12:06.358653 master-0 kubenswrapper[7648]: E0308 03:12:06.358652 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:10.35863757 +0000 UTC m=+22.969955860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : secret "serving-cert" not found
Mar 08 03:12:06.358744 master-0 kubenswrapper[7648]: E0308 03:12:06.358684 7648 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 08 03:12:06.358744 master-0 kubenswrapper[7648]: E0308 03:12:06.358714 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit podName:4bed9990-a0ff-4669-9bcd-ba0882fb8f0d nodeName:}" failed. No retries permitted until 2026-03-08 03:12:10.358707303 +0000 UTC m=+22.970025583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit") pod "apiserver-8649944765-6thnh" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d") : configmap "audit-0" not found
Mar 08 03:12:06.773823 master-0 kubenswrapper[7648]: I0308 03:12:06.773602 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:12:07.605073 master-0 kubenswrapper[7648]: I0308 03:12:07.605028 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"]
Mar 08 03:12:07.606314 master-0 kubenswrapper[7648]: I0308 03:12:07.606293 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.609339 master-0 kubenswrapper[7648]: I0308 03:12:07.609240 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 08 03:12:07.609564 master-0 kubenswrapper[7648]: I0308 03:12:07.609549 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 08 03:12:07.619220 master-0 kubenswrapper[7648]: I0308 03:12:07.619047 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 08 03:12:07.683932 master-0 kubenswrapper[7648]: I0308 03:12:07.683875 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nwgh\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-kube-api-access-4nwgh\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.683932 master-0 kubenswrapper[7648]: I0308 03:12:07.683932 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.684130 master-0 kubenswrapper[7648]: I0308 03:12:07.683988 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.684130 master-0 kubenswrapper[7648]: I0308 03:12:07.684110 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.684211 master-0 kubenswrapper[7648]: I0308 03:12:07.684187 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.785212 master-0 kubenswrapper[7648]: I0308 03:12:07.785107 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.785457 master-0 kubenswrapper[7648]: I0308 03:12:07.785403 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nwgh\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-kube-api-access-4nwgh\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.785547 master-0 kubenswrapper[7648]: I0308 03:12:07.785470 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.785547 master-0 kubenswrapper[7648]: I0308 03:12:07.785540 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.786276 master-0 kubenswrapper[7648]: I0308 03:12:07.785630 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.786276 master-0 kubenswrapper[7648]: I0308 03:12:07.786101 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.786276 master-0 kubenswrapper[7648]: I0308 03:12:07.786229 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.786276 master-0 kubenswrapper[7648]: I0308 03:12:07.786251 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.790960 master-0 kubenswrapper[7648]: I0308 03:12:07.790925 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:07.870506 master-0 kubenswrapper[7648]: I0308 03:12:07.865907 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"]
Mar 08 03:12:08.138301 master-0 kubenswrapper[7648]: I0308 03:12:08.130349 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nwgh\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-kube-api-access-4nwgh\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:08.157389 master-0 kubenswrapper[7648]: I0308 03:12:08.149458 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-8649944765-6thnh"]
Mar 08 03:12:08.157389 master-0 kubenswrapper[7648]: E0308 03:12:08.156819 7648 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-8649944765-6thnh" podUID="4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"
Mar 08 03:12:08.242741 master-0 kubenswrapper[7648]: I0308 03:12:08.242696 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:12:08.281545 master-0 kubenswrapper[7648]: I0308 03:12:08.279892 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"]
Mar 08 03:12:08.284520 master-0 kubenswrapper[7648]: I0308 03:12:08.283550 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.309683 master-0 kubenswrapper[7648]: I0308 03:12:08.309641 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 08 03:12:08.309919 master-0 kubenswrapper[7648]: I0308 03:12:08.309864 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 08 03:12:08.313542 master-0 kubenswrapper[7648]: I0308 03:12:08.311616 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 08 03:12:08.313542 master-0 kubenswrapper[7648]: I0308 03:12:08.313352 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 08 03:12:08.315301 master-0 kubenswrapper[7648]: I0308 03:12:08.314397 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"]
Mar 08 03:12:08.402914 master-0 kubenswrapper[7648]: I0308 03:12:08.402738 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.403092 master-0 kubenswrapper[7648]: I0308 03:12:08.403008 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/53254b19-b5b3-4f97-bc64-37be8b2a41b7-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.403092 master-0 kubenswrapper[7648]: I0308 03:12:08.403039 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.403092 master-0 kubenswrapper[7648]: I0308 03:12:08.403064 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.403184 master-0 kubenswrapper[7648]: I0308 03:12:08.403164 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dskxf\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-kube-api-access-dskxf\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.403521 master-0 kubenswrapper[7648]: I0308 03:12:08.403459 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504390 master-0 kubenswrapper[7648]: I0308 03:12:08.504342 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504390 master-0 kubenswrapper[7648]: I0308 03:12:08.504388 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/53254b19-b5b3-4f97-bc64-37be8b2a41b7-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504390 master-0 kubenswrapper[7648]: I0308 03:12:08.504409 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504656 master-0 kubenswrapper[7648]: I0308 03:12:08.504433 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504656 master-0 kubenswrapper[7648]: I0308 03:12:08.504524 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dskxf\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-kube-api-access-dskxf\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504656 master-0 kubenswrapper[7648]: I0308 03:12:08.504579 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504747 master-0 kubenswrapper[7648]: I0308 03:12:08.504665 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.504747 master-0 kubenswrapper[7648]: I0308 03:12:08.504707 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.505054 master-0 kubenswrapper[7648]: I0308 03:12:08.505026 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/53254b19-b5b3-4f97-bc64-37be8b2a41b7-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.505655 master-0 kubenswrapper[7648]: E0308 03:12:08.505569 7648 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found
Mar 08 03:12:08.505969 master-0 kubenswrapper[7648]: E0308 03:12:08.505904 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs podName:53254b19-b5b3-4f97-bc64-37be8b2a41b7 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:09.005733003 +0000 UTC m=+21.617051303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-gdwg9" (UID: "53254b19-b5b3-4f97-bc64-37be8b2a41b7") : secret "catalogserver-cert" not found
Mar 08 03:12:08.522795 master-0 kubenswrapper[7648]: I0308 03:12:08.522568 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.540760 master-0 kubenswrapper[7648]: I0308 03:12:08.540149 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dskxf\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-kube-api-access-dskxf\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:08.890275 master-0 kubenswrapper[7648]: I0308 03:12:08.890211 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8649944765-6thnh"
Mar 08 03:12:08.900656 master-0 kubenswrapper[7648]: I0308 03:12:08.900619 7648 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:09.014789 master-0 kubenswrapper[7648]: I0308 03:12:09.014699 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-serving-ca\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.014789 master-0 kubenswrapper[7648]: I0308 03:12:09.014757 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glsh4\" (UniqueName: \"kubernetes.io/projected/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-kube-api-access-glsh4\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.014789 master-0 kubenswrapper[7648]: I0308 03:12:09.014791 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-image-import-ca\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.015272 master-0 kubenswrapper[7648]: I0308 03:12:09.014839 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-config\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.015272 master-0 kubenswrapper[7648]: I0308 03:12:09.014868 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-client\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.015272 master-0 kubenswrapper[7648]: I0308 03:12:09.014899 7648 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-encryption-config\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.015272 master-0 kubenswrapper[7648]: I0308 03:12:09.014973 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit-dir\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.015272 master-0 kubenswrapper[7648]: I0308 03:12:09.015002 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-node-pullsecrets\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.015272 master-0 kubenswrapper[7648]: I0308 03:12:09.015037 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-trusted-ca-bundle\") pod \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\" (UID: \"4bed9990-a0ff-4669-9bcd-ba0882fb8f0d\") " Mar 08 03:12:09.015272 master-0 kubenswrapper[7648]: I0308 03:12:09.015205 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:12:09.016328 master-0 kubenswrapper[7648]: E0308 03:12:09.015497 7648 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret 
"catalogserver-cert" not found Mar 08 03:12:09.016328 master-0 kubenswrapper[7648]: E0308 03:12:09.015562 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs podName:53254b19-b5b3-4f97-bc64-37be8b2a41b7 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:10.015544936 +0000 UTC m=+22.626863226 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-gdwg9" (UID: "53254b19-b5b3-4f97-bc64-37be8b2a41b7") : secret "catalogserver-cert" not found Mar 08 03:12:09.016328 master-0 kubenswrapper[7648]: I0308 03:12:09.015828 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:09.016328 master-0 kubenswrapper[7648]: I0308 03:12:09.016015 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:09.016328 master-0 kubenswrapper[7648]: I0308 03:12:09.015490 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-config" (OuterVolumeSpecName: "config") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:09.016328 master-0 kubenswrapper[7648]: I0308 03:12:09.016192 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:09.016328 master-0 kubenswrapper[7648]: I0308 03:12:09.016236 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:09.018129 master-0 kubenswrapper[7648]: I0308 03:12:09.016863 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:09.018633 master-0 kubenswrapper[7648]: I0308 03:12:09.018539 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:12:09.018933 master-0 kubenswrapper[7648]: I0308 03:12:09.018883 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:12:09.026110 master-0 kubenswrapper[7648]: I0308 03:12:09.025655 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-kube-api-access-glsh4" (OuterVolumeSpecName: "kube-api-access-glsh4") pod "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" (UID: "4bed9990-a0ff-4669-9bcd-ba0882fb8f0d"). InnerVolumeSpecName "kube-api-access-glsh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:09.116832 master-0 kubenswrapper[7648]: I0308 03:12:09.116756 7648 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.116832 master-0 kubenswrapper[7648]: I0308 03:12:09.116811 7648 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.116832 master-0 kubenswrapper[7648]: I0308 03:12:09.116831 7648 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.116832 master-0 kubenswrapper[7648]: I0308 03:12:09.116850 7648 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.117305 master-0 kubenswrapper[7648]: I0308 03:12:09.116872 7648 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.117305 master-0 kubenswrapper[7648]: I0308 03:12:09.116891 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glsh4\" (UniqueName: \"kubernetes.io/projected/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-kube-api-access-glsh4\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.117305 master-0 kubenswrapper[7648]: I0308 03:12:09.116909 7648 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.117305 master-0 kubenswrapper[7648]: I0308 03:12:09.116928 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.117305 master-0 kubenswrapper[7648]: I0308 03:12:09.116945 7648 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:09.622950 master-0 kubenswrapper[7648]: I0308 03:12:09.622695 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 
08 03:12:09.622950 master-0 kubenswrapper[7648]: I0308 03:12:09.622744 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:12:09.622950 master-0 kubenswrapper[7648]: E0308 03:12:09.622852 7648 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 08 03:12:09.622950 master-0 kubenswrapper[7648]: E0308 03:12:09.622907 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca podName:734e4130-6a6f-4739-9b25-4fe9cb8561c2 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:25.622892605 +0000 UTC m=+38.234210885 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca") pod "route-controller-manager-7f5bdf8b74-zn8sz" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2") : configmap "client-ca" not found Mar 08 03:12:09.639260 master-0 kubenswrapper[7648]: I0308 03:12:09.639207 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"route-controller-manager-7f5bdf8b74-zn8sz\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:12:09.899195 master-0 kubenswrapper[7648]: I0308 03:12:09.897348 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-8649944765-6thnh" Mar 08 03:12:09.961516 master-0 kubenswrapper[7648]: I0308 03:12:09.952594 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-778796f487-vzb5n"] Mar 08 03:12:09.961516 master-0 kubenswrapper[7648]: I0308 03:12:09.953765 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:09.961516 master-0 kubenswrapper[7648]: I0308 03:12:09.955148 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 03:12:09.961516 master-0 kubenswrapper[7648]: I0308 03:12:09.955621 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 03:12:09.961516 master-0 kubenswrapper[7648]: I0308 03:12:09.956418 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-778796f487-vzb5n"] Mar 08 03:12:09.967413 master-0 kubenswrapper[7648]: I0308 03:12:09.964898 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 03:12:09.975815 master-0 kubenswrapper[7648]: I0308 03:12:09.972342 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:12:09.975815 master-0 kubenswrapper[7648]: I0308 03:12:09.972604 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 03:12:09.975815 master-0 kubenswrapper[7648]: I0308 03:12:09.972500 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 03:12:09.975815 master-0 kubenswrapper[7648]: I0308 03:12:09.973038 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 03:12:09.975815 master-0 kubenswrapper[7648]: I0308 
03:12:09.973703 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-8649944765-6thnh"] Mar 08 03:12:09.975815 master-0 kubenswrapper[7648]: I0308 03:12:09.974277 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 03:12:09.975815 master-0 kubenswrapper[7648]: I0308 03:12:09.974458 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 03:12:09.981879 master-0 kubenswrapper[7648]: I0308 03:12:09.981717 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 03:12:09.983185 master-0 kubenswrapper[7648]: I0308 03:12:09.983146 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-8649944765-6thnh"] Mar 08 03:12:10.043192 master-0 kubenswrapper[7648]: I0308 03:12:10.043135 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-trusted-ca-bundle\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043192 master-0 kubenswrapper[7648]: I0308 03:12:10.043182 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-audit\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043215 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043251 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-audit-dir\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043273 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-node-pullsecrets\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043312 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-serving-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043335 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043350 7648 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-encryption-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043364 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556dx\" (UniqueName: \"kubernetes.io/projected/f99d6808-9fec-402d-93f7-41575a5a0a08-kube-api-access-556dx\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043386 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-image-import-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043404 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-client\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043447 master-0 kubenswrapper[7648]: I0308 03:12:10.043422 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-serving-cert\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " 
pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.043827 master-0 kubenswrapper[7648]: E0308 03:12:10.043545 7648 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 08 03:12:10.043827 master-0 kubenswrapper[7648]: E0308 03:12:10.043586 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs podName:53254b19-b5b3-4f97-bc64-37be8b2a41b7 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:12.043570413 +0000 UTC m=+24.654888703 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-gdwg9" (UID: "53254b19-b5b3-4f97-bc64-37be8b2a41b7") : secret "catalogserver-cert" not found Mar 08 03:12:10.043827 master-0 kubenswrapper[7648]: I0308 03:12:10.043664 7648 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-audit\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:10.043827 master-0 kubenswrapper[7648]: I0308 03:12:10.043707 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:10.144860 master-0 kubenswrapper[7648]: I0308 03:12:10.144494 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-serving-cert\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.144860 master-0 kubenswrapper[7648]: I0308 03:12:10.144702 7648 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-trusted-ca-bundle\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.144860 master-0 kubenswrapper[7648]: I0308 03:12:10.144732 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-audit\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.144860 master-0 kubenswrapper[7648]: I0308 03:12:10.144792 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-audit-dir\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.144860 master-0 kubenswrapper[7648]: I0308 03:12:10.144824 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-node-pullsecrets\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.144860 master-0 kubenswrapper[7648]: I0308 03:12:10.144879 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-serving-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.145202 master-0 kubenswrapper[7648]: I0308 03:12:10.144913 7648 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.145202 master-0 kubenswrapper[7648]: I0308 03:12:10.144933 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-encryption-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.145202 master-0 kubenswrapper[7648]: I0308 03:12:10.144956 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556dx\" (UniqueName: \"kubernetes.io/projected/f99d6808-9fec-402d-93f7-41575a5a0a08-kube-api-access-556dx\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.145202 master-0 kubenswrapper[7648]: I0308 03:12:10.144987 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-image-import-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.145202 master-0 kubenswrapper[7648]: I0308 03:12:10.145012 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-client\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.146549 master-0 
kubenswrapper[7648]: I0308 03:12:10.146520 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-serving-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.147824 master-0 kubenswrapper[7648]: I0308 03:12:10.147791 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-trusted-ca-bundle\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.148432 master-0 kubenswrapper[7648]: I0308 03:12:10.148405 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-audit\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.148497 master-0 kubenswrapper[7648]: I0308 03:12:10.148467 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-audit-dir\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.148569 master-0 kubenswrapper[7648]: I0308 03:12:10.148547 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-node-pullsecrets\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.149736 master-0 kubenswrapper[7648]: 
I0308 03:12:10.149667 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-image-import-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.149788 master-0 kubenswrapper[7648]: I0308 03:12:10.149760 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.152435 master-0 kubenswrapper[7648]: I0308 03:12:10.152404 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-encryption-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.154099 master-0 kubenswrapper[7648]: I0308 03:12:10.154076 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-client\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.170445 master-0 kubenswrapper[7648]: I0308 03:12:10.170413 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-serving-cert\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.179813 master-0 kubenswrapper[7648]: I0308 03:12:10.179771 7648 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556dx\" (UniqueName: \"kubernetes.io/projected/f99d6808-9fec-402d-93f7-41575a5a0a08-kube-api-access-556dx\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.213797 master-0 kubenswrapper[7648]: I0308 03:12:10.213677 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:12:10.214869 master-0 kubenswrapper[7648]: I0308 03:12:10.214822 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.220195 master-0 kubenswrapper[7648]: I0308 03:12:10.220143 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 08 03:12:10.222244 master-0 kubenswrapper[7648]: I0308 03:12:10.222212 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:12:10.280729 master-0 kubenswrapper[7648]: I0308 03:12:10.280684 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:12:10.350115 master-0 kubenswrapper[7648]: I0308 03:12:10.350078 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.350115 master-0 kubenswrapper[7648]: I0308 03:12:10.350116 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b2988e-7fa3-44ee-be58-51964231a2ab-kube-api-access\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.350321 master-0 kubenswrapper[7648]: I0308 03:12:10.350183 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-var-lock\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.451900 master-0 kubenswrapper[7648]: I0308 03:12:10.451753 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-var-lock\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.452095 master-0 kubenswrapper[7648]: I0308 03:12:10.451941 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-kubelet-dir\") pod \"installer-1-master-0\" 
(UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.452095 master-0 kubenswrapper[7648]: I0308 03:12:10.451981 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b2988e-7fa3-44ee-be58-51964231a2ab-kube-api-access\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.452095 master-0 kubenswrapper[7648]: I0308 03:12:10.452074 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-var-lock\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.452412 master-0 kubenswrapper[7648]: I0308 03:12:10.452369 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.479019 master-0 kubenswrapper[7648]: I0308 03:12:10.478987 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b2988e-7fa3-44ee-be58-51964231a2ab-kube-api-access\") pod \"installer-1-master-0\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:10.552507 master-0 kubenswrapper[7648]: I0308 03:12:10.552436 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:11.474381 master-0 kubenswrapper[7648]: I0308 03:12:11.474293 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55446fdfd6-sn988"] Mar 08 03:12:11.475319 master-0 kubenswrapper[7648]: E0308 03:12:11.474692 7648 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" podUID="b0cd3a37-76c7-4158-997d-0b8b2fd81131" Mar 08 03:12:11.498001 master-0 kubenswrapper[7648]: I0308 03:12:11.497930 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz"] Mar 08 03:12:11.498394 master-0 kubenswrapper[7648]: E0308 03:12:11.498355 7648 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" podUID="734e4130-6a6f-4739-9b25-4fe9cb8561c2" Mar 08 03:12:11.619701 master-0 kubenswrapper[7648]: I0308 03:12:11.619640 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bed9990-a0ff-4669-9bcd-ba0882fb8f0d" path="/var/lib/kubelet/pods/4bed9990-a0ff-4669-9bcd-ba0882fb8f0d/volumes" Mar 08 03:12:11.904586 master-0 kubenswrapper[7648]: I0308 03:12:11.904517 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:12:11.904586 master-0 kubenswrapper[7648]: I0308 03:12:11.904533 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:12:11.915146 master-0 kubenswrapper[7648]: I0308 03:12:11.911197 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:12:11.915328 master-0 kubenswrapper[7648]: I0308 03:12:11.915296 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:12:11.973807 master-0 kubenswrapper[7648]: I0308 03:12:11.973707 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-config\") pod \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " Mar 08 03:12:11.973807 master-0 kubenswrapper[7648]: I0308 03:12:11.973777 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert\") pod \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " Mar 08 03:12:11.974107 master-0 kubenswrapper[7648]: I0308 03:12:11.973832 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvszw\" (UniqueName: \"kubernetes.io/projected/734e4130-6a6f-4739-9b25-4fe9cb8561c2-kube-api-access-fvszw\") pod \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " Mar 08 03:12:11.974107 master-0 kubenswrapper[7648]: I0308 03:12:11.973866 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-proxy-ca-bundles\") pod \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\" (UID: 
\"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " Mar 08 03:12:11.974107 master-0 kubenswrapper[7648]: I0308 03:12:11.973901 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7hxt\" (UniqueName: \"kubernetes.io/projected/b0cd3a37-76c7-4158-997d-0b8b2fd81131-kube-api-access-k7hxt\") pod \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " Mar 08 03:12:11.974107 master-0 kubenswrapper[7648]: I0308 03:12:11.973936 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-config\") pod \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " Mar 08 03:12:11.974107 master-0 kubenswrapper[7648]: I0308 03:12:11.973985 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") pod \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\" (UID: \"734e4130-6a6f-4739-9b25-4fe9cb8561c2\") " Mar 08 03:12:11.974587 master-0 kubenswrapper[7648]: I0308 03:12:11.974298 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-config" (OuterVolumeSpecName: "config") pod "734e4130-6a6f-4739-9b25-4fe9cb8561c2" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:11.974587 master-0 kubenswrapper[7648]: I0308 03:12:11.974517 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b0cd3a37-76c7-4158-997d-0b8b2fd81131" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:11.975005 master-0 kubenswrapper[7648]: I0308 03:12:11.974935 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-config" (OuterVolumeSpecName: "config") pod "b0cd3a37-76c7-4158-997d-0b8b2fd81131" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:11.977683 master-0 kubenswrapper[7648]: I0308 03:12:11.977630 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "734e4130-6a6f-4739-9b25-4fe9cb8561c2" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:12:11.979090 master-0 kubenswrapper[7648]: I0308 03:12:11.979039 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cd3a37-76c7-4158-997d-0b8b2fd81131-kube-api-access-k7hxt" (OuterVolumeSpecName: "kube-api-access-k7hxt") pod "b0cd3a37-76c7-4158-997d-0b8b2fd81131" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131"). InnerVolumeSpecName "kube-api-access-k7hxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:11.986940 master-0 kubenswrapper[7648]: I0308 03:12:11.986887 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b0cd3a37-76c7-4158-997d-0b8b2fd81131" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:12:11.987704 master-0 kubenswrapper[7648]: I0308 03:12:11.987636 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734e4130-6a6f-4739-9b25-4fe9cb8561c2-kube-api-access-fvszw" (OuterVolumeSpecName: "kube-api-access-fvszw") pod "734e4130-6a6f-4739-9b25-4fe9cb8561c2" (UID: "734e4130-6a6f-4739-9b25-4fe9cb8561c2"). InnerVolumeSpecName "kube-api-access-fvszw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:12.075717 master-0 kubenswrapper[7648]: I0308 03:12:12.075663 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:12:12.076034 master-0 kubenswrapper[7648]: I0308 03:12:12.075751 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:12.076034 master-0 kubenswrapper[7648]: I0308 03:12:12.075775 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0cd3a37-76c7-4158-997d-0b8b2fd81131-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:12.076034 master-0 kubenswrapper[7648]: I0308 03:12:12.075788 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvszw\" (UniqueName: \"kubernetes.io/projected/734e4130-6a6f-4739-9b25-4fe9cb8561c2-kube-api-access-fvszw\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:12.076034 master-0 kubenswrapper[7648]: I0308 03:12:12.075798 7648 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:12.076034 master-0 kubenswrapper[7648]: I0308 03:12:12.075809 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7hxt\" (UniqueName: \"kubernetes.io/projected/b0cd3a37-76c7-4158-997d-0b8b2fd81131-kube-api-access-k7hxt\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:12.076034 master-0 kubenswrapper[7648]: I0308 03:12:12.075819 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:12.076034 master-0 kubenswrapper[7648]: I0308 03:12:12.075828 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734e4130-6a6f-4739-9b25-4fe9cb8561c2-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:12.080860 master-0 kubenswrapper[7648]: I0308 03:12:12.080793 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:12:12.223148 master-0 kubenswrapper[7648]: I0308 03:12:12.222956 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:12:12.745917 master-0 kubenswrapper[7648]: I0308 03:12:12.745874 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"] Mar 08 03:12:12.767044 master-0 kubenswrapper[7648]: W0308 03:12:12.766427 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e1af4e8_2ade_48b3_8c56_0ab78f77fac9.slice/crio-a8085b4e985b562b8d336416a01305d62c87bbe11cf1a12349c7ff41540427d2 WatchSource:0}: Error finding container a8085b4e985b562b8d336416a01305d62c87bbe11cf1a12349c7ff41540427d2: Status 404 returned error can't find the container with id a8085b4e985b562b8d336416a01305d62c87bbe11cf1a12349c7ff41540427d2 Mar 08 03:12:12.793063 master-0 kubenswrapper[7648]: I0308 03:12:12.793017 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:12:12.794031 master-0 kubenswrapper[7648]: I0308 03:12:12.793736 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"controller-manager-55446fdfd6-sn988\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:12:12.826617 master-0 kubenswrapper[7648]: I0308 03:12:12.826559 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-778796f487-vzb5n"] Mar 08 03:12:12.850500 master-0 kubenswrapper[7648]: W0308 03:12:12.845944 7648 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99d6808_9fec_402d_93f7_41575a5a0a08.slice/crio-111fa7be9df663403130635550a8dd29af3564ad73e3800569dcb7f1fe8c2849 WatchSource:0}: Error finding container 111fa7be9df663403130635550a8dd29af3564ad73e3800569dcb7f1fe8c2849: Status 404 returned error can't find the container with id 111fa7be9df663403130635550a8dd29af3564ad73e3800569dcb7f1fe8c2849 Mar 08 03:12:12.889668 master-0 kubenswrapper[7648]: I0308 03:12:12.887078 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:12:12.917164 master-0 kubenswrapper[7648]: I0308 03:12:12.916623 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-778796f487-vzb5n" event={"ID":"f99d6808-9fec-402d-93f7-41575a5a0a08","Type":"ContainerStarted","Data":"111fa7be9df663403130635550a8dd29af3564ad73e3800569dcb7f1fe8c2849"} Mar 08 03:12:12.918791 master-0 kubenswrapper[7648]: I0308 03:12:12.918510 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" event={"ID":"e74c8bb2-e063-4b60-b3fe-651aa534d029","Type":"ContainerStarted","Data":"5cb8f3acbb7aa9ec545c1b8e4b064d16cbafd48b223783d78db54ee94e2fb56a"} Mar 08 03:12:12.919741 master-0 kubenswrapper[7648]: I0308 03:12:12.919703 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"b9b2988e-7fa3-44ee-be58-51964231a2ab","Type":"ContainerStarted","Data":"dfd20110ce4cf1cfff31e419d57e6348705990b2bdc516a8aae4208278e8e44e"} Mar 08 03:12:12.921118 master-0 kubenswrapper[7648]: I0308 03:12:12.920645 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" event={"ID":"8f99f81a-fd2d-432e-a3bc-e451342650b1","Type":"ContainerStarted","Data":"1e61b9ac37fe51c8960f7a8b93dfbd9d59b0b11cd9f9b7104df5ef3c3c168bec"} 
Mar 08 03:12:12.925650 master-0 kubenswrapper[7648]: I0308 03:12:12.921544 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" event={"ID":"c4af87e2-50c3-4d08-9326-9c8876a6fd7b","Type":"ContainerStarted","Data":"b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b"} Mar 08 03:12:12.925650 master-0 kubenswrapper[7648]: I0308 03:12:12.922621 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" event={"ID":"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9","Type":"ContainerStarted","Data":"a8085b4e985b562b8d336416a01305d62c87bbe11cf1a12349c7ff41540427d2"} Mar 08 03:12:12.925650 master-0 kubenswrapper[7648]: I0308 03:12:12.924218 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" event={"ID":"fd6b827c-70b0-47ed-b07c-c696343248a8","Type":"ContainerStarted","Data":"903120aab74563b4610a95957e1909bb96f27217bb43d514694690518a714fc6"} Mar 08 03:12:12.925650 master-0 kubenswrapper[7648]: I0308 03:12:12.924236 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" event={"ID":"fd6b827c-70b0-47ed-b07c-c696343248a8","Type":"ContainerStarted","Data":"927e976b2419f80e2b156dd6620627f0ab5b15535fdab986491afec086084730"} Mar 08 03:12:12.925911 master-0 kubenswrapper[7648]: I0308 03:12:12.925795 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55446fdfd6-sn988" Mar 08 03:12:12.928620 master-0 kubenswrapper[7648]: I0308 03:12:12.927033 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" event={"ID":"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b","Type":"ContainerStarted","Data":"e0852f11a27b04eb53659f54af6ee018541ea3eb08a6e316017731af47f5934d"} Mar 08 03:12:12.928620 master-0 kubenswrapper[7648]: I0308 03:12:12.927085 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" event={"ID":"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b","Type":"ContainerStarted","Data":"643f3b1d5189adb625272097c9d23e7af0847cd627439de5de3ccca7ed7bb060"} Mar 08 03:12:12.928620 master-0 kubenswrapper[7648]: I0308 03:12:12.927177 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz" Mar 08 03:12:12.959291 master-0 kubenswrapper[7648]: I0308 03:12:12.958934 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"] Mar 08 03:12:12.995200 master-0 kubenswrapper[7648]: I0308 03:12:12.994622 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") pod \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\" (UID: \"b0cd3a37-76c7-4158-997d-0b8b2fd81131\") " Mar 08 03:12:12.995782 master-0 kubenswrapper[7648]: I0308 03:12:12.995743 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca" (OuterVolumeSpecName: "client-ca") pod "b0cd3a37-76c7-4158-997d-0b8b2fd81131" (UID: "b0cd3a37-76c7-4158-997d-0b8b2fd81131"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:12.996315 master-0 kubenswrapper[7648]: I0308 03:12:12.996293 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0cd3a37-76c7-4158-997d-0b8b2fd81131-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:13.037202 master-0 kubenswrapper[7648]: I0308 03:12:13.036844 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc"] Mar 08 03:12:13.037398 master-0 kubenswrapper[7648]: I0308 03:12:13.037377 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.038032 master-0 kubenswrapper[7648]: I0308 03:12:13.038006 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz"] Mar 08 03:12:13.040030 master-0 kubenswrapper[7648]: I0308 03:12:13.039133 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f5bdf8b74-zn8sz"] Mar 08 03:12:13.041859 master-0 kubenswrapper[7648]: I0308 03:12:13.040803 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:12:13.042027 master-0 kubenswrapper[7648]: I0308 03:12:13.041997 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc"] Mar 08 03:12:13.042068 master-0 kubenswrapper[7648]: I0308 03:12:13.040898 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:12:13.042116 master-0 kubenswrapper[7648]: I0308 03:12:13.041098 7648 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:12:13.042158 master-0 kubenswrapper[7648]: I0308 03:12:13.041152 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:12:13.042194 master-0 kubenswrapper[7648]: I0308 03:12:13.041235 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:12:13.096767 master-0 kubenswrapper[7648]: I0308 03:12:13.096675 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-config\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.096767 master-0 kubenswrapper[7648]: I0308 03:12:13.096727 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-client-ca\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.096767 master-0 kubenswrapper[7648]: I0308 03:12:13.096746 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-serving-cert\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.099005 master-0 kubenswrapper[7648]: I0308 03:12:13.096807 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-xmlx5\" (UniqueName: \"kubernetes.io/projected/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-kube-api-access-xmlx5\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.099005 master-0 kubenswrapper[7648]: I0308 03:12:13.096847 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/734e4130-6a6f-4739-9b25-4fe9cb8561c2-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:13.197740 master-0 kubenswrapper[7648]: I0308 03:12:13.197340 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-config\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.197740 master-0 kubenswrapper[7648]: I0308 03:12:13.197413 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-client-ca\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.197740 master-0 kubenswrapper[7648]: I0308 03:12:13.197437 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-serving-cert\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.197740 master-0 kubenswrapper[7648]: I0308 03:12:13.197539 7648 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmlx5\" (UniqueName: \"kubernetes.io/projected/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-kube-api-access-xmlx5\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.198953 master-0 kubenswrapper[7648]: I0308 03:12:13.198928 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-config\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.203334 master-0 kubenswrapper[7648]: I0308 03:12:13.202659 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-client-ca\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.203334 master-0 kubenswrapper[7648]: I0308 03:12:13.203164 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-serving-cert\") pod \"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.225296 master-0 kubenswrapper[7648]: I0308 03:12:13.225255 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmlx5\" (UniqueName: \"kubernetes.io/projected/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-kube-api-access-xmlx5\") pod 
\"route-controller-manager-5874dc4b9-7nmkc\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.314220 master-0 kubenswrapper[7648]: I0308 03:12:13.314177 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55446fdfd6-sn988"] Mar 08 03:12:13.324572 master-0 kubenswrapper[7648]: I0308 03:12:13.324531 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55446fdfd6-sn988"] Mar 08 03:12:13.392039 master-0 kubenswrapper[7648]: I0308 03:12:13.391984 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:13.414706 master-0 kubenswrapper[7648]: I0308 03:12:13.409304 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-htnv4"] Mar 08 03:12:13.414706 master-0 kubenswrapper[7648]: I0308 03:12:13.410010 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.414706 master-0 kubenswrapper[7648]: I0308 03:12:13.412580 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 03:12:13.414706 master-0 kubenswrapper[7648]: I0308 03:12:13.413120 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 03:12:13.414706 master-0 kubenswrapper[7648]: I0308 03:12:13.413250 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 03:12:13.414706 master-0 kubenswrapper[7648]: I0308 03:12:13.414108 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 03:12:13.422304 master-0 kubenswrapper[7648]: I0308 03:12:13.422107 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-htnv4"] Mar 08 03:12:13.609047 master-0 kubenswrapper[7648]: I0308 03:12:13.609011 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bda3bd48-6de3-49b0-b2ce-96d97e97f178-metrics-tls\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.609210 master-0 kubenswrapper[7648]: I0308 03:12:13.609064 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda3bd48-6de3-49b0-b2ce-96d97e97f178-config-volume\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.609210 master-0 kubenswrapper[7648]: I0308 03:12:13.609087 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-885mp\" (UniqueName: 
\"kubernetes.io/projected/bda3bd48-6de3-49b0-b2ce-96d97e97f178-kube-api-access-885mp\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.629286 master-0 kubenswrapper[7648]: I0308 03:12:13.628648 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734e4130-6a6f-4739-9b25-4fe9cb8561c2" path="/var/lib/kubelet/pods/734e4130-6a6f-4739-9b25-4fe9cb8561c2/volumes" Mar 08 03:12:13.629286 master-0 kubenswrapper[7648]: I0308 03:12:13.629041 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cd3a37-76c7-4158-997d-0b8b2fd81131" path="/var/lib/kubelet/pods/b0cd3a37-76c7-4158-997d-0b8b2fd81131/volumes" Mar 08 03:12:13.681922 master-0 kubenswrapper[7648]: I0308 03:12:13.681858 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc"] Mar 08 03:12:13.709775 master-0 kubenswrapper[7648]: I0308 03:12:13.709725 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bda3bd48-6de3-49b0-b2ce-96d97e97f178-metrics-tls\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.709775 master-0 kubenswrapper[7648]: I0308 03:12:13.709773 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda3bd48-6de3-49b0-b2ce-96d97e97f178-config-volume\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.709860 master-0 kubenswrapper[7648]: I0308 03:12:13.709795 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-885mp\" (UniqueName: \"kubernetes.io/projected/bda3bd48-6de3-49b0-b2ce-96d97e97f178-kube-api-access-885mp\") pod 
\"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.712206 master-0 kubenswrapper[7648]: I0308 03:12:13.711067 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda3bd48-6de3-49b0-b2ce-96d97e97f178-config-volume\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.713029 master-0 kubenswrapper[7648]: I0308 03:12:13.712964 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bda3bd48-6de3-49b0-b2ce-96d97e97f178-metrics-tls\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.745508 master-0 kubenswrapper[7648]: I0308 03:12:13.744342 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-885mp\" (UniqueName: \"kubernetes.io/projected/bda3bd48-6de3-49b0-b2ce-96d97e97f178-kube-api-access-885mp\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:13.791474 master-0 kubenswrapper[7648]: I0308 03:12:13.791337 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lmqn7"] Mar 08 03:12:13.792006 master-0 kubenswrapper[7648]: I0308 03:12:13.791883 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:13.818511 master-0 kubenswrapper[7648]: I0308 03:12:13.813601 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xsp\" (UniqueName: \"kubernetes.io/projected/dfe0357f-dab4-4424-869c-f6070b411a35-kube-api-access-w5xsp\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:13.818511 master-0 kubenswrapper[7648]: I0308 03:12:13.813669 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dfe0357f-dab4-4424-869c-f6070b411a35-hosts-file\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:13.915750 master-0 kubenswrapper[7648]: I0308 03:12:13.915381 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xsp\" (UniqueName: \"kubernetes.io/projected/dfe0357f-dab4-4424-869c-f6070b411a35-kube-api-access-w5xsp\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:13.915750 master-0 kubenswrapper[7648]: I0308 03:12:13.915503 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dfe0357f-dab4-4424-869c-f6070b411a35-hosts-file\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:13.915991 master-0 kubenswrapper[7648]: I0308 03:12:13.915945 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dfe0357f-dab4-4424-869c-f6070b411a35-hosts-file\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " 
pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:13.932123 master-0 kubenswrapper[7648]: I0308 03:12:13.932065 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xsp\" (UniqueName: \"kubernetes.io/projected/dfe0357f-dab4-4424-869c-f6070b411a35-kube-api-access-w5xsp\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:13.936402 master-0 kubenswrapper[7648]: I0308 03:12:13.934957 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"b9b2988e-7fa3-44ee-be58-51964231a2ab","Type":"ContainerStarted","Data":"6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642"} Mar 08 03:12:13.938540 master-0 kubenswrapper[7648]: I0308 03:12:13.938507 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" event={"ID":"8f99f81a-fd2d-432e-a3bc-e451342650b1","Type":"ContainerStarted","Data":"d36e294f99faee2aa24af3661ab7c7c9c13c254d1681f2e825ed443ce4a60660"} Mar 08 03:12:13.946666 master-0 kubenswrapper[7648]: I0308 03:12:13.946259 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" event={"ID":"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9","Type":"ContainerStarted","Data":"86741d12d093cb6aa6a95e5bb1f80fb5356ac767f03cb7b7fe1561e3ecaecf93"} Mar 08 03:12:13.946666 master-0 kubenswrapper[7648]: I0308 03:12:13.946312 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" event={"ID":"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9","Type":"ContainerStarted","Data":"115308b4e38a50965cda00a6f3da9ba63adca456afd5e8dd547096a0f49ebb12"} Mar 08 03:12:13.947138 master-0 kubenswrapper[7648]: I0308 03:12:13.947107 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:12:13.948725 master-0 kubenswrapper[7648]: I0308 03:12:13.948694 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" event={"ID":"53254b19-b5b3-4f97-bc64-37be8b2a41b7","Type":"ContainerStarted","Data":"95cb1ab0414f6248676ceab0da8402d36a93f6fced2ddcec794373deb0d0db80"} Mar 08 03:12:13.948725 master-0 kubenswrapper[7648]: I0308 03:12:13.948722 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" event={"ID":"53254b19-b5b3-4f97-bc64-37be8b2a41b7","Type":"ContainerStarted","Data":"1174559eb926edc2225e560dc0dd22582ba204988cd3d4b6d4211ed46e4108ab"} Mar 08 03:12:13.948821 master-0 kubenswrapper[7648]: I0308 03:12:13.948732 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" event={"ID":"53254b19-b5b3-4f97-bc64-37be8b2a41b7","Type":"ContainerStarted","Data":"40b87ce5bc138e32a2067ed918b783e37e64b1d585f2b0c8e8982345833631fd"} Mar 08 03:12:13.949119 master-0 kubenswrapper[7648]: I0308 03:12:13.949093 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:12:13.950026 master-0 kubenswrapper[7648]: I0308 03:12:13.949997 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" event={"ID":"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e","Type":"ContainerStarted","Data":"054eef96a95fe907231f464fab11b404c512c2aaa4fbea6c8dc2730252ffddc1"} Mar 08 03:12:13.961369 master-0 kubenswrapper[7648]: I0308 03:12:13.961301 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=3.961282318 podStartE2EDuration="3.961282318s" 
podCreationTimestamp="2026-03-08 03:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:13.955812799 +0000 UTC m=+26.567131089" watchObservedRunningTime="2026-03-08 03:12:13.961282318 +0000 UTC m=+26.572600618" Mar 08 03:12:14.022316 master-0 kubenswrapper[7648]: I0308 03:12:14.022245 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" podStartSLOduration=6.022227641 podStartE2EDuration="6.022227641s" podCreationTimestamp="2026-03-08 03:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:13.995741167 +0000 UTC m=+26.607059467" watchObservedRunningTime="2026-03-08 03:12:14.022227641 +0000 UTC m=+26.633545931" Mar 08 03:12:14.032313 master-0 kubenswrapper[7648]: I0308 03:12:14.032250 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:14.051881 master-0 kubenswrapper[7648]: I0308 03:12:14.051765 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" podStartSLOduration=7.05174489 podStartE2EDuration="7.05174489s" podCreationTimestamp="2026-03-08 03:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:14.050242838 +0000 UTC m=+26.661561138" watchObservedRunningTime="2026-03-08 03:12:14.05174489 +0000 UTC m=+26.663063170" Mar 08 03:12:14.157183 master-0 kubenswrapper[7648]: I0308 03:12:14.156596 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:12:14.181237 master-0 kubenswrapper[7648]: W0308 03:12:14.181184 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe0357f_dab4_4424_869c_f6070b411a35.slice/crio-aa9984f4888ad176cca86a609f919e722ac828a4b46cdcdc3a09bfd6dca13141 WatchSource:0}: Error finding container aa9984f4888ad176cca86a609f919e722ac828a4b46cdcdc3a09bfd6dca13141: Status 404 returned error can't find the container with id aa9984f4888ad176cca86a609f919e722ac828a4b46cdcdc3a09bfd6dca13141 Mar 08 03:12:14.422880 master-0 kubenswrapper[7648]: I0308 03:12:14.422750 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-htnv4"] Mar 08 03:12:14.439890 master-0 kubenswrapper[7648]: W0308 03:12:14.439842 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda3bd48_6de3_49b0_b2ce_96d97e97f178.slice/crio-770f1062326f2c1fcb8406b21c197925d94d8f11835c27505f3a03f752984724 WatchSource:0}: Error finding container 770f1062326f2c1fcb8406b21c197925d94d8f11835c27505f3a03f752984724: Status 404 returned error can't find the container with id 770f1062326f2c1fcb8406b21c197925d94d8f11835c27505f3a03f752984724 Mar 08 03:12:14.961638 master-0 kubenswrapper[7648]: I0308 03:12:14.961582 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-htnv4" event={"ID":"bda3bd48-6de3-49b0-b2ce-96d97e97f178","Type":"ContainerStarted","Data":"770f1062326f2c1fcb8406b21c197925d94d8f11835c27505f3a03f752984724"} Mar 08 03:12:14.969294 master-0 kubenswrapper[7648]: I0308 03:12:14.969242 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lmqn7" event={"ID":"dfe0357f-dab4-4424-869c-f6070b411a35","Type":"ContainerStarted","Data":"dd3fcc1fbba21c7ba470a00b5004954d064667b1d0e4f8004e5b9b5a402fff30"} Mar 08 
03:12:14.969346 master-0 kubenswrapper[7648]: I0308 03:12:14.969297 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lmqn7" event={"ID":"dfe0357f-dab4-4424-869c-f6070b411a35","Type":"ContainerStarted","Data":"aa9984f4888ad176cca86a609f919e722ac828a4b46cdcdc3a09bfd6dca13141"} Mar 08 03:12:14.996602 master-0 kubenswrapper[7648]: I0308 03:12:14.991098 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lmqn7" podStartSLOduration=1.9910775250000001 podStartE2EDuration="1.991077525s" podCreationTimestamp="2026-03-08 03:12:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:14.99091774 +0000 UTC m=+27.602236070" watchObservedRunningTime="2026-03-08 03:12:14.991077525 +0000 UTC m=+27.602395845" Mar 08 03:12:15.462735 master-0 kubenswrapper[7648]: I0308 03:12:15.460612 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b855446cf-f998w"] Mar 08 03:12:15.462735 master-0 kubenswrapper[7648]: I0308 03:12:15.461429 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.463907 master-0 kubenswrapper[7648]: I0308 03:12:15.463523 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:12:15.464267 master-0 kubenswrapper[7648]: I0308 03:12:15.464214 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 03:12:15.464328 master-0 kubenswrapper[7648]: I0308 03:12:15.464255 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:12:15.464380 master-0 kubenswrapper[7648]: I0308 03:12:15.464306 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 03:12:15.465222 master-0 kubenswrapper[7648]: I0308 03:12:15.465169 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:12:15.475430 master-0 kubenswrapper[7648]: I0308 03:12:15.475373 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b855446cf-f998w"] Mar 08 03:12:15.479586 master-0 kubenswrapper[7648]: I0308 03:12:15.479535 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:12:15.638726 master-0 kubenswrapper[7648]: I0308 03:12:15.638671 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-client-ca\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.638916 master-0 kubenswrapper[7648]: I0308 03:12:15.638744 7648 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-config\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.638994 master-0 kubenswrapper[7648]: I0308 03:12:15.638950 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqlj\" (UniqueName: \"kubernetes.io/projected/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-kube-api-access-zbqlj\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.639092 master-0 kubenswrapper[7648]: I0308 03:12:15.639066 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-serving-cert\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.639147 master-0 kubenswrapper[7648]: I0308 03:12:15.639134 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-proxy-ca-bundles\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.740363 master-0 kubenswrapper[7648]: I0308 03:12:15.740221 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-serving-cert\") pod 
\"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.740363 master-0 kubenswrapper[7648]: I0308 03:12:15.740274 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-proxy-ca-bundles\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.740628 master-0 kubenswrapper[7648]: I0308 03:12:15.740535 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-client-ca\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.740673 master-0 kubenswrapper[7648]: I0308 03:12:15.740639 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-config\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.741141 master-0 kubenswrapper[7648]: I0308 03:12:15.741120 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqlj\" (UniqueName: \"kubernetes.io/projected/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-kube-api-access-zbqlj\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.742237 master-0 kubenswrapper[7648]: I0308 03:12:15.742182 7648 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-proxy-ca-bundles\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.742369 master-0 kubenswrapper[7648]: I0308 03:12:15.742342 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-config\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.742617 master-0 kubenswrapper[7648]: I0308 03:12:15.742588 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-client-ca\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.757967 master-0 kubenswrapper[7648]: I0308 03:12:15.757312 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-serving-cert\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.759002 master-0 kubenswrapper[7648]: I0308 03:12:15.758910 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqlj\" (UniqueName: \"kubernetes.io/projected/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-kube-api-access-zbqlj\") pod \"controller-manager-5b855446cf-f998w\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " 
pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:15.788597 master-0 kubenswrapper[7648]: I0308 03:12:15.788552 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:18.245880 master-0 kubenswrapper[7648]: I0308 03:12:18.245838 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:12:18.352647 master-0 kubenswrapper[7648]: I0308 03:12:18.347615 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b855446cf-f998w"] Mar 08 03:12:19.001301 master-0 kubenswrapper[7648]: I0308 03:12:19.001219 7648 generic.go:334] "Generic (PLEG): container finished" podID="f99d6808-9fec-402d-93f7-41575a5a0a08" containerID="ff016ecc0b1406f7273a88aa6d6e16959d56705496418cbd4648458844f2bbb1" exitCode=0 Mar 08 03:12:19.001525 master-0 kubenswrapper[7648]: I0308 03:12:19.001355 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-778796f487-vzb5n" event={"ID":"f99d6808-9fec-402d-93f7-41575a5a0a08","Type":"ContainerDied","Data":"ff016ecc0b1406f7273a88aa6d6e16959d56705496418cbd4648458844f2bbb1"} Mar 08 03:12:19.007973 master-0 kubenswrapper[7648]: I0308 03:12:19.007901 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-htnv4" event={"ID":"bda3bd48-6de3-49b0-b2ce-96d97e97f178","Type":"ContainerStarted","Data":"171d20d40aca9dbeea19a4727a6c7abbde402980abb71d4cca4e56736ac504c4"} Mar 08 03:12:19.007973 master-0 kubenswrapper[7648]: I0308 03:12:19.007972 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-htnv4" event={"ID":"bda3bd48-6de3-49b0-b2ce-96d97e97f178","Type":"ContainerStarted","Data":"e96191235b360c2fc7457cb5d0c2675d7cc50ab5b393895673c5ba78fd9d6a8f"} Mar 08 03:12:19.008156 
master-0 kubenswrapper[7648]: I0308 03:12:19.008068 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:19.010015 master-0 kubenswrapper[7648]: I0308 03:12:19.009958 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" event={"ID":"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e","Type":"ContainerStarted","Data":"e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a"} Mar 08 03:12:19.010716 master-0 kubenswrapper[7648]: I0308 03:12:19.010672 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:19.012879 master-0 kubenswrapper[7648]: I0308 03:12:19.012804 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" event={"ID":"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c","Type":"ContainerStarted","Data":"ad8221884944f8d23c4ee1643b7a86a37022bffdd15185cbce466ba1d7ac80d8"} Mar 08 03:12:19.031733 master-0 kubenswrapper[7648]: I0308 03:12:19.028090 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:19.056009 master-0 kubenswrapper[7648]: I0308 03:12:19.055906 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-htnv4" podStartSLOduration=2.546566294 podStartE2EDuration="6.055873896s" podCreationTimestamp="2026-03-08 03:12:13 +0000 UTC" firstStartedPulling="2026-03-08 03:12:14.443104475 +0000 UTC m=+27.054422765" lastFinishedPulling="2026-03-08 03:12:17.952412077 +0000 UTC m=+30.563730367" observedRunningTime="2026-03-08 03:12:19.052415277 +0000 UTC m=+31.663733607" watchObservedRunningTime="2026-03-08 03:12:19.055873896 +0000 UTC m=+31.667192226" Mar 08 03:12:19.084988 master-0 
kubenswrapper[7648]: I0308 03:12:19.082318 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" podStartSLOduration=3.855280049 podStartE2EDuration="8.082294558s" podCreationTimestamp="2026-03-08 03:12:11 +0000 UTC" firstStartedPulling="2026-03-08 03:12:13.709093515 +0000 UTC m=+26.320411805" lastFinishedPulling="2026-03-08 03:12:17.936108014 +0000 UTC m=+30.547426314" observedRunningTime="2026-03-08 03:12:19.077226153 +0000 UTC m=+31.688544493" watchObservedRunningTime="2026-03-08 03:12:19.082294558 +0000 UTC m=+31.693612888"
Mar 08 03:12:19.808130 master-0 kubenswrapper[7648]: I0308 03:12:19.807987 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 08 03:12:19.808746 master-0 kubenswrapper[7648]: I0308 03:12:19.808180 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="b9b2988e-7fa3-44ee-be58-51964231a2ab" containerName="installer" containerID="cri-o://6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642" gracePeriod=30
Mar 08 03:12:20.023254 master-0 kubenswrapper[7648]: I0308 03:12:20.023165 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-778796f487-vzb5n" event={"ID":"f99d6808-9fec-402d-93f7-41575a5a0a08","Type":"ContainerStarted","Data":"9f3ae4e68bea1bf295a27da84078d342012e563ce9efcce3a615bbcd200897e7"}
Mar 08 03:12:20.023254 master-0 kubenswrapper[7648]: I0308 03:12:20.023263 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-778796f487-vzb5n" event={"ID":"f99d6808-9fec-402d-93f7-41575a5a0a08","Type":"ContainerStarted","Data":"216428309e601476bdb59b392e425becef97b4c9b844c72f2fe29df7e4d02e35"}
Mar 08 03:12:20.050505 master-0 kubenswrapper[7648]: I0308 03:12:20.046730 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-778796f487-vzb5n" podStartSLOduration=6.943260604 podStartE2EDuration="12.046712218s" podCreationTimestamp="2026-03-08 03:12:08 +0000 UTC" firstStartedPulling="2026-03-08 03:12:12.857310351 +0000 UTC m=+25.468628641" lastFinishedPulling="2026-03-08 03:12:17.960761965 +0000 UTC m=+30.572080255" observedRunningTime="2026-03-08 03:12:20.045720014 +0000 UTC m=+32.657038324" watchObservedRunningTime="2026-03-08 03:12:20.046712218 +0000 UTC m=+32.658030508"
Mar 08 03:12:20.195504 master-0 kubenswrapper[7648]: I0308 03:12:20.194768 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"]
Mar 08 03:12:20.195504 master-0 kubenswrapper[7648]: I0308 03:12:20.195427 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201002 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201080 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201315 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201344 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201437 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201461 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201532 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 08 03:12:20.202502 master-0 kubenswrapper[7648]: I0308 03:12:20.201567 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 08 03:12:20.230505 master-0 kubenswrapper[7648]: I0308 03:12:20.226966 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"]
Mar 08 03:12:20.282138 master-0 kubenswrapper[7648]: I0308 03:12:20.282082 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-778796f487-vzb5n"
Mar 08 03:12:20.284589 master-0 kubenswrapper[7648]: I0308 03:12:20.282568 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-778796f487-vzb5n"
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: I0308 03:12:20.300581 7648 patch_prober.go:28] interesting pod/apiserver-778796f487-vzb5n container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]log ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]etcd ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/max-in-flight-filter ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/openshift.io-startinformers ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 08 03:12:20.300737 master-0 kubenswrapper[7648]: livez check failed
Mar 08 03:12:20.301306 master-0 kubenswrapper[7648]: I0308 03:12:20.300652 7648 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-778796f487-vzb5n" podUID="f99d6808-9fec-402d-93f7-41575a5a0a08" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 03:12:20.313144 master-0 kubenswrapper[7648]: I0308 03:12:20.313094 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-trusted-ca-bundle\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.313339 master-0 kubenswrapper[7648]: I0308 03:12:20.313160 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-serving-cert\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.313339 master-0 kubenswrapper[7648]: I0308 03:12:20.313199 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-encryption-config\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.313339 master-0 kubenswrapper[7648]: I0308 03:12:20.313217 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntd2k\" (UniqueName: \"kubernetes.io/projected/dac2b210-2fbb-4d25-a0ea-1825259cee3b-kube-api-access-ntd2k\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.313339 master-0 kubenswrapper[7648]: I0308 03:12:20.313236 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-serving-ca\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.313339 master-0 kubenswrapper[7648]: I0308 03:12:20.313271 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-client\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.313339 master-0 kubenswrapper[7648]: I0308 03:12:20.313301 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-dir\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.313339 master-0 kubenswrapper[7648]: I0308 03:12:20.313327 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-policies\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.414197 master-0 kubenswrapper[7648]: I0308 03:12:20.414147 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:12:20.414444 master-0 kubenswrapper[7648]: I0308 03:12:20.414425 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-policies\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.414547 master-0 kubenswrapper[7648]: I0308 03:12:20.414531 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:12:20.414624 master-0 kubenswrapper[7648]: I0308 03:12:20.414612 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:12:20.414706 master-0 kubenswrapper[7648]: I0308 03:12:20.414695 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-trusted-ca-bundle\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.414786 master-0 kubenswrapper[7648]: I0308 03:12:20.414773 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-serving-cert\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.414855 master-0 kubenswrapper[7648]: I0308 03:12:20.414843 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:12:20.414941 master-0 kubenswrapper[7648]: I0308 03:12:20.414928 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-encryption-config\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.415019 master-0 kubenswrapper[7648]: I0308 03:12:20.415007 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntd2k\" (UniqueName: \"kubernetes.io/projected/dac2b210-2fbb-4d25-a0ea-1825259cee3b-kube-api-access-ntd2k\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.415093 master-0 kubenswrapper[7648]: I0308 03:12:20.415082 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-serving-ca\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.415172 master-0 kubenswrapper[7648]: I0308 03:12:20.415161 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-client\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.415236 master-0 kubenswrapper[7648]: I0308 03:12:20.415225 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:12:20.415316 master-0 kubenswrapper[7648]: I0308 03:12:20.415301 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-dir\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.415467 master-0 kubenswrapper[7648]: I0308 03:12:20.415453 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-dir\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.418547 master-0 kubenswrapper[7648]: I0308 03:12:20.418503 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-serving-ca\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.418715 master-0 kubenswrapper[7648]: I0308 03:12:20.418681 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-trusted-ca-bundle\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.419672 master-0 kubenswrapper[7648]: I0308 03:12:20.419639 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:12:20.420998 master-0 kubenswrapper[7648]: I0308 03:12:20.420377 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-policies\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.420998 master-0 kubenswrapper[7648]: I0308 03:12:20.420686 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:12:20.421123 master-0 kubenswrapper[7648]: I0308 03:12:20.421093 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-encryption-config\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.422763 master-0 kubenswrapper[7648]: I0308 03:12:20.421926 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-client\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.422763 master-0 kubenswrapper[7648]: I0308 03:12:20.422315 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-serving-cert\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.423572 master-0 kubenswrapper[7648]: I0308 03:12:20.423554 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:12:20.425130 master-0 kubenswrapper[7648]: I0308 03:12:20.425093 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:12:20.426341 master-0 kubenswrapper[7648]: I0308 03:12:20.426300 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:12:20.453644 master-0 kubenswrapper[7648]: I0308 03:12:20.449750 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntd2k\" (UniqueName: \"kubernetes.io/projected/dac2b210-2fbb-4d25-a0ea-1825259cee3b-kube-api-access-ntd2k\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.516618 master-0 kubenswrapper[7648]: I0308 03:12:20.516568 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:12:20.516776 master-0 kubenswrapper[7648]: I0308 03:12:20.516685 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:12:20.530239 master-0 kubenswrapper[7648]: I0308 03:12:20.530202 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:12:20.531189 master-0 kubenswrapper[7648]: I0308 03:12:20.531160 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:12:20.575350 master-0 kubenswrapper[7648]: I0308 03:12:20.573698 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:12:20.673717 master-0 kubenswrapper[7648]: I0308 03:12:20.671789 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:12:20.673717 master-0 kubenswrapper[7648]: I0308 03:12:20.673528 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:12:20.677850 master-0 kubenswrapper[7648]: I0308 03:12:20.677454 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:12:20.677850 master-0 kubenswrapper[7648]: I0308 03:12:20.677492 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:12:20.677850 master-0 kubenswrapper[7648]: I0308 03:12:20.677757 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:12:20.677850 master-0 kubenswrapper[7648]: I0308 03:12:20.677835 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:12:20.678081 master-0 kubenswrapper[7648]: I0308 03:12:20.678061 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"
Mar 08 03:12:20.999560 master-0 kubenswrapper[7648]: I0308 03:12:20.999497 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"]
Mar 08 03:12:21.003094 master-0 kubenswrapper[7648]: W0308 03:12:21.003057 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac2b210_2fbb_4d25_a0ea_1825259cee3b.slice/crio-ae9034db4782bac8f7d81887d48fe45bb1b3f6c402f0bfca0b19827ba74bb1e6 WatchSource:0}: Error finding container ae9034db4782bac8f7d81887d48fe45bb1b3f6c402f0bfca0b19827ba74bb1e6: Status 404 returned error can't find the container with id ae9034db4782bac8f7d81887d48fe45bb1b3f6c402f0bfca0b19827ba74bb1e6
Mar 08 03:12:21.040444 master-0 kubenswrapper[7648]: I0308 03:12:21.040386 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" event={"ID":"dac2b210-2fbb-4d25-a0ea-1825259cee3b","Type":"ContainerStarted","Data":"ae9034db4782bac8f7d81887d48fe45bb1b3f6c402f0bfca0b19827ba74bb1e6"}
Mar 08 03:12:21.136696 master-0 kubenswrapper[7648]: I0308 03:12:21.136636 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"]
Mar 08 03:12:21.175957 master-0 kubenswrapper[7648]: I0308 03:12:21.174932 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jl9tj"]
Mar 08 03:12:21.182340 master-0 kubenswrapper[7648]: I0308 03:12:21.182298 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"]
Mar 08 03:12:21.184107 master-0 kubenswrapper[7648]: I0308 03:12:21.184083 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-772zs"]
Mar 08 03:12:21.186181 master-0 kubenswrapper[7648]: I0308 03:12:21.186058 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"]
Mar 08 03:12:21.346820 master-0 kubenswrapper[7648]: I0308 03:12:21.346774 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg"]
Mar 08 03:12:21.448020 master-0 kubenswrapper[7648]: I0308 03:12:21.447903 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"]
Mar 08 03:12:22.212230 master-0 kubenswrapper[7648]: I0308 03:12:22.212159 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 08 03:12:22.213162 master-0 kubenswrapper[7648]: I0308 03:12:22.213119 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.217110 master-0 kubenswrapper[7648]: I0308 03:12:22.217052 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 08 03:12:22.227688 master-0 kubenswrapper[7648]: I0308 03:12:22.227227 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:12:22.260676 master-0 kubenswrapper[7648]: I0308 03:12:22.259910 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kube-api-access\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.260676 master-0 kubenswrapper[7648]: I0308 03:12:22.259964 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-var-lock\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.260676 master-0 kubenswrapper[7648]: I0308 03:12:22.260000 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.360897 master-0 kubenswrapper[7648]: I0308 03:12:22.360831 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.361094 master-0 kubenswrapper[7648]: I0308 03:12:22.360924 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kube-api-access\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.361094 master-0 kubenswrapper[7648]: I0308 03:12:22.360960 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-var-lock\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.361094 master-0 kubenswrapper[7648]: I0308 03:12:22.361052 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-var-lock\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.361241 master-0 kubenswrapper[7648]: I0308 03:12:22.361101 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.381709 master-0 kubenswrapper[7648]: I0308 03:12:22.381671 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kube-api-access\") pod \"installer-2-master-0\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.505256 master-0 kubenswrapper[7648]: W0308 03:12:22.505224 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb1fd59_5e3e_4711_83cf_c5cf2ec7622c.slice/crio-d60de575ffc4ab26e9f652593a89b44b4516637c5944d999700e15162300a100 WatchSource:0}: Error finding container d60de575ffc4ab26e9f652593a89b44b4516637c5944d999700e15162300a100: Status 404 returned error can't find the container with id d60de575ffc4ab26e9f652593a89b44b4516637c5944d999700e15162300a100
Mar 08 03:12:22.538785 master-0 kubenswrapper[7648]: I0308 03:12:22.538732 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 08 03:12:22.960016 master-0 kubenswrapper[7648]: I0308 03:12:22.959439 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 08 03:12:23.078215 master-0 kubenswrapper[7648]: I0308 03:12:23.077724 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" event={"ID":"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c","Type":"ContainerStarted","Data":"55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a"}
Mar 08 03:12:23.078995 master-0 kubenswrapper[7648]: I0308 03:12:23.078970 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w"
Mar 08 03:12:23.086701 master-0 kubenswrapper[7648]: I0308 03:12:23.086656 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jl9tj" event={"ID":"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c","Type":"ContainerStarted","Data":"d60de575ffc4ab26e9f652593a89b44b4516637c5944d999700e15162300a100"}
Mar 08 03:12:23.094541 master-0 kubenswrapper[7648]: I0308 03:12:23.094278 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"534f9589-9c0c-48aa-a5a8-fddf36d2d562","Type":"ContainerStarted","Data":"8105f30e57393894ce687c653037e150aaf5e1d89aca93b913e5f23cc6aca59e"}
Mar 08 03:12:23.097049 master-0 kubenswrapper[7648]: I0308 03:12:23.097011 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" event={"ID":"4f822854-b9ac-46f2-b03b-e7215fba9208","Type":"ContainerStarted","Data":"e023598af62616102fc3da25dddc7bed12c4ad58ecf15ebabad27596e663a5e7"}
Mar 08 03:12:23.097118 master-0 kubenswrapper[7648]: I0308 03:12:23.097106 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w"
Mar 08 03:12:23.110531 master-0 kubenswrapper[7648]: I0308 03:12:23.107964 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" podStartSLOduration=7.918018088 podStartE2EDuration="12.107947817s" podCreationTimestamp="2026-03-08 03:12:11 +0000 UTC" firstStartedPulling="2026-03-08 03:12:18.36862299 +0000 UTC m=+30.979941300" lastFinishedPulling="2026-03-08 03:12:22.558552739 +0000 UTC m=+35.169871029" observedRunningTime="2026-03-08 03:12:23.099501345 +0000 UTC m=+35.710819635" watchObservedRunningTime="2026-03-08 03:12:23.107947817 +0000 UTC m=+35.719266097"
Mar 08 03:12:23.110531 master-0 kubenswrapper[7648]: I0308 03:12:23.108313 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" event={"ID":"4108f513-acef-473a-ab03-f3761b2bd0d8","Type":"ContainerStarted","Data":"3b71760d7242393a221c7ae1af331545931cbaec81501f72daac6ae1c2882487"}
Mar 08 03:12:23.110531 master-0 kubenswrapper[7648]: I0308 03:12:23.110180 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" event={"ID":"d83aa242-606f-4adc-b689-4aa89625b533","Type":"ContainerStarted","Data":"cb95d850788ef393eb0f1ea09b2f5ec3ff3892998a30374932e76aea89c669e6"}
Mar 08 03:12:23.112765 master-0 kubenswrapper[7648]: I0308 03:12:23.111988 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerStarted","Data":"0d9abd668d0f5e4396724f5ea282e6dd1f64c0edb81c83294ae09a514ba683b4"}
Mar 08 03:12:23.115877 master-0 kubenswrapper[7648]: I0308 03:12:23.115780 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" event={"ID":"23b66415-df37-4015-9a0c-69115b3a0739","Type":"ContainerStarted","Data":"04168952ada741f79304ee9b25e1212567fc1ce3d719a0050a26b711accbbea4"}
Mar 08 03:12:23.120769 master-0 kubenswrapper[7648]: I0308 03:12:23.119596 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" event={"ID":"2bbe9b81-0efb-4caa-bacd-55348cd392c6","Type":"ContainerStarted","Data":"9cdd38315c94b7acef97e2bb04676f46ccddeedba19b5c454b47f2aa2d19e0b4"}
Mar 08 03:12:23.120769 master-0 kubenswrapper[7648]: I0308 03:12:23.119641 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" event={"ID":"2bbe9b81-0efb-4caa-bacd-55348cd392c6","Type":"ContainerStarted","Data":"e9fad8077e3e386a60b5dc7bb7e5c6bd154a1e1fbbbc76393683792778d9fac5"}
Mar 08 03:12:24.136673 master-0 kubenswrapper[7648]: I0308 03:12:24.136439 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"534f9589-9c0c-48aa-a5a8-fddf36d2d562","Type":"ContainerStarted","Data":"d5b621a3334abe2132d33e21101dd76b1c492821c93abc7d97ebb298b5cc4bfc"}
Mar 08 03:12:24.151527 master-0 kubenswrapper[7648]: I0308 03:12:24.151398 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=2.151380815 podStartE2EDuration="2.151380815s" podCreationTimestamp="2026-03-08 03:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:24.149160988 +0000 UTC m=+36.760479278" watchObservedRunningTime="2026-03-08 03:12:24.151380815 +0000 UTC m=+36.762699105"
Mar 08 03:12:25.289080 master-0 kubenswrapper[7648]: I0308 03:12:25.289029 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-778796f487-vzb5n"
Mar 08 03:12:25.295594 master-0 kubenswrapper[7648]: I0308 03:12:25.295493 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-778796f487-vzb5n"
Mar 08 03:12:26.153798 master-0 kubenswrapper[7648]: I0308 03:12:26.153561 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Mar 08 03:12:26.154137 master-0 kubenswrapper[7648]: I0308 03:12:26.154115 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:12:26.155822 master-0 kubenswrapper[7648]: I0308 03:12:26.155764 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-gcdsk"
Mar 08 03:12:26.156603 master-0 kubenswrapper[7648]: I0308 03:12:26.155945 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 08 03:12:26.162816 master-0 kubenswrapper[7648]: I0308 03:12:26.162693 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Mar 08 03:12:26.241394 master-0 kubenswrapper[7648]: I0308 03:12:26.241351 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:12:26.241617 master-0 kubenswrapper[7648]: I0308 03:12:26.241496 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " 
pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.241617 master-0 kubenswrapper[7648]: I0308 03:12:26.241538 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.343360 master-0 kubenswrapper[7648]: I0308 03:12:26.343313 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.343360 master-0 kubenswrapper[7648]: I0308 03:12:26.343366 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.343867 master-0 kubenswrapper[7648]: I0308 03:12:26.343400 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.343867 master-0 kubenswrapper[7648]: I0308 03:12:26.343474 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " 
pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.343867 master-0 kubenswrapper[7648]: I0308 03:12:26.343771 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.364006 master-0 kubenswrapper[7648]: I0308 03:12:26.363940 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:26.517674 master-0 kubenswrapper[7648]: I0308 03:12:26.517093 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:12:27.458042 master-0 kubenswrapper[7648]: I0308 03:12:27.457995 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:12:27.727447 master-0 kubenswrapper[7648]: I0308 03:12:27.725596 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"] Mar 08 03:12:27.727447 master-0 kubenswrapper[7648]: I0308 03:12:27.726450 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" podUID="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" containerName="cluster-version-operator" containerID="cri-o://b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b" gracePeriod=130 Mar 08 03:12:28.932517 master-0 kubenswrapper[7648]: I0308 03:12:28.932474 7648 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:12:29.036544 master-0 kubenswrapper[7648]: I0308 03:12:29.036508 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-htnv4" Mar 08 03:12:29.083881 master-0 kubenswrapper[7648]: I0308 03:12:29.083755 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") pod \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " Mar 08 03:12:29.084810 master-0 kubenswrapper[7648]: I0308 03:12:29.084776 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") pod \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " Mar 08 03:12:29.084951 master-0 kubenswrapper[7648]: I0308 03:12:29.084929 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca\") pod \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " Mar 08 03:12:29.085004 master-0 kubenswrapper[7648]: I0308 03:12:29.084978 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access\") pod \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " Mar 08 03:12:29.085702 master-0 kubenswrapper[7648]: I0308 03:12:29.085613 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") pod \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\" (UID: \"c4af87e2-50c3-4d08-9326-9c8876a6fd7b\") " Mar 08 03:12:29.087572 master-0 kubenswrapper[7648]: I0308 03:12:29.087546 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "c4af87e2-50c3-4d08-9326-9c8876a6fd7b" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:29.087624 master-0 kubenswrapper[7648]: I0308 03:12:29.087573 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "c4af87e2-50c3-4d08-9326-9c8876a6fd7b" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b"). InnerVolumeSpecName "etc-ssl-certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:29.088471 master-0 kubenswrapper[7648]: I0308 03:12:29.088440 7648 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:29.088628 master-0 kubenswrapper[7648]: I0308 03:12:29.088471 7648 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:29.089378 master-0 kubenswrapper[7648]: I0308 03:12:29.089350 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca" (OuterVolumeSpecName: "service-ca") pod "c4af87e2-50c3-4d08-9326-9c8876a6fd7b" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:29.094589 master-0 kubenswrapper[7648]: I0308 03:12:29.093876 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c4af87e2-50c3-4d08-9326-9c8876a6fd7b" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:12:29.098276 master-0 kubenswrapper[7648]: I0308 03:12:29.097984 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c4af87e2-50c3-4d08-9326-9c8876a6fd7b" (UID: "c4af87e2-50c3-4d08-9326-9c8876a6fd7b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:29.180953 master-0 kubenswrapper[7648]: I0308 03:12:29.180903 7648 generic.go:334] "Generic (PLEG): container finished" podID="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" containerID="b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b" exitCode=0 Mar 08 03:12:29.180953 master-0 kubenswrapper[7648]: I0308 03:12:29.180944 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" event={"ID":"c4af87e2-50c3-4d08-9326-9c8876a6fd7b","Type":"ContainerDied","Data":"b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b"} Mar 08 03:12:29.181148 master-0 kubenswrapper[7648]: I0308 03:12:29.180980 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" event={"ID":"c4af87e2-50c3-4d08-9326-9c8876a6fd7b","Type":"ContainerDied","Data":"60f35b3e196e060290be05c8adcbfb7dc922d8c1abdf61e0112a61ef38f0180d"} Mar 08 03:12:29.181148 master-0 kubenswrapper[7648]: I0308 03:12:29.180997 7648 scope.go:117] "RemoveContainer" containerID="b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b" Mar 08 03:12:29.181148 master-0 kubenswrapper[7648]: I0308 03:12:29.180926 7648 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq" Mar 08 03:12:29.191778 master-0 kubenswrapper[7648]: I0308 03:12:29.191044 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" event={"ID":"4108f513-acef-473a-ab03-f3761b2bd0d8","Type":"ContainerStarted","Data":"037a157ecdddcd6f9c4ef440e42fe2074521ea1ed5682104cc6d822251b3efc0"} Mar 08 03:12:29.191778 master-0 kubenswrapper[7648]: I0308 03:12:29.191772 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:29.191778 master-0 kubenswrapper[7648]: I0308 03:12:29.191797 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:29.192023 master-0 kubenswrapper[7648]: I0308 03:12:29.191808 7648 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4af87e2-50c3-4d08-9326-9c8876a6fd7b-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:29.195112 master-0 kubenswrapper[7648]: I0308 03:12:29.193137 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerStarted","Data":"df3e8baabefc90e04c02f0f45ed7aa89841f1f4954012b9c683b090559c5e516"} Mar 08 03:12:29.195112 master-0 kubenswrapper[7648]: I0308 03:12:29.193841 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:12:29.196548 master-0 kubenswrapper[7648]: I0308 03:12:29.196523 7648 scope.go:117] "RemoveContainer" 
containerID="b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b" Mar 08 03:12:29.200701 master-0 kubenswrapper[7648]: I0308 03:12:29.197284 7648 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-7hsbf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" start-of-body= Mar 08 03:12:29.200701 master-0 kubenswrapper[7648]: I0308 03:12:29.197321 7648 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" Mar 08 03:12:29.200701 master-0 kubenswrapper[7648]: E0308 03:12:29.199925 7648 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b\": container with ID starting with b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b not found: ID does not exist" containerID="b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b" Mar 08 03:12:29.200701 master-0 kubenswrapper[7648]: I0308 03:12:29.199954 7648 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b"} err="failed to get container status \"b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b\": rpc error: code = NotFound desc = could not find container \"b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b\": container with ID starting with b13bb8d8f28dfac397f346ebaa53f2a0b8bfdba2b0132246173ba6906d22930b not found: ID does not exist" Mar 08 03:12:29.235294 master-0 kubenswrapper[7648]: I0308 
03:12:29.235249 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"] Mar 08 03:12:29.240060 master-0 kubenswrapper[7648]: I0308 03:12:29.240010 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-f64sq"] Mar 08 03:12:29.291464 master-0 kubenswrapper[7648]: I0308 03:12:29.289761 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v"] Mar 08 03:12:29.293670 master-0 kubenswrapper[7648]: E0308 03:12:29.293614 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" containerName="cluster-version-operator" Mar 08 03:12:29.293670 master-0 kubenswrapper[7648]: I0308 03:12:29.293673 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" containerName="cluster-version-operator" Mar 08 03:12:29.293934 master-0 kubenswrapper[7648]: I0308 03:12:29.293914 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" containerName="cluster-version-operator" Mar 08 03:12:29.295594 master-0 kubenswrapper[7648]: I0308 03:12:29.294457 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.297165 master-0 kubenswrapper[7648]: I0308 03:12:29.297077 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:12:29.297800 master-0 kubenswrapper[7648]: I0308 03:12:29.297335 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:12:29.297800 master-0 kubenswrapper[7648]: I0308 03:12:29.297640 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-r5m92" Mar 08 03:12:29.297800 master-0 kubenswrapper[7648]: I0308 03:12:29.297756 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:12:29.346933 master-0 kubenswrapper[7648]: I0308 03:12:29.346883 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 08 03:12:29.395359 master-0 kubenswrapper[7648]: I0308 03:12:29.394772 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.395359 master-0 kubenswrapper[7648]: I0308 03:12:29.394809 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af653e87-ce5f-4f1a-a20d-233c563694ba-serving-cert\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 
03:12:29.395359 master-0 kubenswrapper[7648]: I0308 03:12:29.394829 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af653e87-ce5f-4f1a-a20d-233c563694ba-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.395359 master-0 kubenswrapper[7648]: I0308 03:12:29.394901 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af653e87-ce5f-4f1a-a20d-233c563694ba-service-ca\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.395359 master-0 kubenswrapper[7648]: I0308 03:12:29.394941 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.496181 master-0 kubenswrapper[7648]: I0308 03:12:29.496127 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.496367 master-0 kubenswrapper[7648]: I0308 03:12:29.496203 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.496367 master-0 kubenswrapper[7648]: I0308 03:12:29.496225 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af653e87-ce5f-4f1a-a20d-233c563694ba-serving-cert\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.496367 master-0 kubenswrapper[7648]: I0308 03:12:29.496242 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af653e87-ce5f-4f1a-a20d-233c563694ba-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.496367 master-0 kubenswrapper[7648]: I0308 03:12:29.496261 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af653e87-ce5f-4f1a-a20d-233c563694ba-service-ca\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.496663 master-0 kubenswrapper[7648]: I0308 03:12:29.496561 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " 
pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.496663 master-0 kubenswrapper[7648]: I0308 03:12:29.496637 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.497675 master-0 kubenswrapper[7648]: I0308 03:12:29.497652 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af653e87-ce5f-4f1a-a20d-233c563694ba-service-ca\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.507014 master-0 kubenswrapper[7648]: I0308 03:12:29.506979 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af653e87-ce5f-4f1a-a20d-233c563694ba-serving-cert\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.520516 master-0 kubenswrapper[7648]: I0308 03:12:29.520270 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af653e87-ce5f-4f1a-a20d-233c563694ba-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.619380 master-0 kubenswrapper[7648]: I0308 03:12:29.619325 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:12:29.623530 master-0 kubenswrapper[7648]: I0308 03:12:29.623474 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4af87e2-50c3-4d08-9326-9c8876a6fd7b" path="/var/lib/kubelet/pods/c4af87e2-50c3-4d08-9326-9c8876a6fd7b/volumes" Mar 08 03:12:30.199045 master-0 kubenswrapper[7648]: I0308 03:12:30.198664 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" event={"ID":"af653e87-ce5f-4f1a-a20d-233c563694ba","Type":"ContainerStarted","Data":"6c59d77b77a1f89b306ddf4cc0f2bd1da0d815a10de107029f05b136ace17ea9"} Mar 08 03:12:30.199045 master-0 kubenswrapper[7648]: I0308 03:12:30.198722 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" event={"ID":"af653e87-ce5f-4f1a-a20d-233c563694ba","Type":"ContainerStarted","Data":"232775f5a2d5493c0a82abf166454589f3f2855c9d7aba021d33f9d3267ef323"} Mar 08 03:12:30.201164 master-0 kubenswrapper[7648]: I0308 03:12:30.201132 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jl9tj" event={"ID":"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c","Type":"ContainerStarted","Data":"fdc94ea6c6f5ba95dc82f217d55f088a1f03975fae76cb0bf5f1405f67eec0ab"} Mar 08 03:12:30.201164 master-0 kubenswrapper[7648]: I0308 03:12:30.201161 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jl9tj" event={"ID":"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c","Type":"ContainerStarted","Data":"a2d05d733b9cd21bd96323cf58d54a76be506c9f8b2f57ee05d9c280d9c5d91e"} Mar 08 03:12:30.206251 master-0 kubenswrapper[7648]: I0308 03:12:30.203411 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" 
event={"ID":"23b66415-df37-4015-9a0c-69115b3a0739","Type":"ContainerStarted","Data":"29c6ed5b13bfb915384e6141f8cbf16cba543eb6524f87e0bd97e324ceae1c63"} Mar 08 03:12:30.206251 master-0 kubenswrapper[7648]: I0308 03:12:30.203448 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" event={"ID":"23b66415-df37-4015-9a0c-69115b3a0739","Type":"ContainerStarted","Data":"bf7afb690bf11b8a7c9ce9f568adbdaaa57866a3aff5ced1711ca0a11620089f"} Mar 08 03:12:30.206251 master-0 kubenswrapper[7648]: I0308 03:12:30.204553 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"7ea81472-8a81-45ec-a07d-8710f47a927d","Type":"ContainerStarted","Data":"d57e1157c7569d934ea76665ae63811243fb6a6eb902e18c216d3947853ca6e4"} Mar 08 03:12:30.206251 master-0 kubenswrapper[7648]: I0308 03:12:30.204593 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"7ea81472-8a81-45ec-a07d-8710f47a927d","Type":"ContainerStarted","Data":"3b767f72fbe851c0148683712fba4f0872103808c8eb0533886fa5261badacc5"} Mar 08 03:12:30.207298 master-0 kubenswrapper[7648]: I0308 03:12:30.207269 7648 generic.go:334] "Generic (PLEG): container finished" podID="dac2b210-2fbb-4d25-a0ea-1825259cee3b" containerID="00745b525b8dc575694c9918d1d6d5efb6d0a2ce9d5450f6f2b8a338bdae4d1c" exitCode=0 Mar 08 03:12:30.207834 master-0 kubenswrapper[7648]: I0308 03:12:30.207756 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" event={"ID":"dac2b210-2fbb-4d25-a0ea-1825259cee3b","Type":"ContainerDied","Data":"00745b525b8dc575694c9918d1d6d5efb6d0a2ce9d5450f6f2b8a338bdae4d1c"} Mar 08 03:12:30.213119 master-0 kubenswrapper[7648]: I0308 03:12:30.213065 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 
03:12:30.231615 master-0 kubenswrapper[7648]: I0308 03:12:30.231103 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" podStartSLOduration=1.231079147 podStartE2EDuration="1.231079147s" podCreationTimestamp="2026-03-08 03:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:30.228768157 +0000 UTC m=+42.840086447" watchObservedRunningTime="2026-03-08 03:12:30.231079147 +0000 UTC m=+42.842397437" Mar 08 03:12:30.760512 master-0 kubenswrapper[7648]: I0308 03:12:30.756197 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:12:30.760512 master-0 kubenswrapper[7648]: I0308 03:12:30.756523 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="534f9589-9c0c-48aa-a5a8-fddf36d2d562" containerName="installer" containerID="cri-o://d5b621a3334abe2132d33e21101dd76b1c492821c93abc7d97ebb298b5cc4bfc" gracePeriod=30 Mar 08 03:12:30.815716 master-0 kubenswrapper[7648]: I0308 03:12:30.815076 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=4.81505775 podStartE2EDuration="4.81505775s" podCreationTimestamp="2026-03-08 03:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:30.814017464 +0000 UTC m=+43.425335754" watchObservedRunningTime="2026-03-08 03:12:30.81505775 +0000 UTC m=+43.426376040" Mar 08 03:12:31.117504 master-0 kubenswrapper[7648]: I0308 03:12:31.109198 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:12:31.117504 master-0 kubenswrapper[7648]: I0308 
03:12:31.109952 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.117504 master-0 kubenswrapper[7648]: I0308 03:12:31.112536 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ks2rl" Mar 08 03:12:31.117504 master-0 kubenswrapper[7648]: I0308 03:12:31.112693 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 03:12:31.130528 master-0 kubenswrapper[7648]: I0308 03:12:31.126762 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:12:31.221675 master-0 kubenswrapper[7648]: I0308 03:12:31.210188 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e81d3c37-e8d7-44c8-973e-13992380ce85-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.221675 master-0 kubenswrapper[7648]: I0308 03:12:31.210242 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e81d3c37-e8d7-44c8-973e-13992380ce85-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.221675 master-0 kubenswrapper[7648]: I0308 03:12:31.210307 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e81d3c37-e8d7-44c8-973e-13992380ce85-var-lock\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " 
pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.221675 master-0 kubenswrapper[7648]: I0308 03:12:31.220624 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_534f9589-9c0c-48aa-a5a8-fddf36d2d562/installer/0.log" Mar 08 03:12:31.221675 master-0 kubenswrapper[7648]: I0308 03:12:31.220702 7648 generic.go:334] "Generic (PLEG): container finished" podID="534f9589-9c0c-48aa-a5a8-fddf36d2d562" containerID="d5b621a3334abe2132d33e21101dd76b1c492821c93abc7d97ebb298b5cc4bfc" exitCode=1 Mar 08 03:12:31.221675 master-0 kubenswrapper[7648]: I0308 03:12:31.220762 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"534f9589-9c0c-48aa-a5a8-fddf36d2d562","Type":"ContainerDied","Data":"d5b621a3334abe2132d33e21101dd76b1c492821c93abc7d97ebb298b5cc4bfc"} Mar 08 03:12:31.223871 master-0 kubenswrapper[7648]: I0308 03:12:31.223745 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" event={"ID":"dac2b210-2fbb-4d25-a0ea-1825259cee3b","Type":"ContainerStarted","Data":"880879806a4e2c128f227860807d56da10b673d5b79dedc95b1c518adca59839"} Mar 08 03:12:31.312098 master-0 kubenswrapper[7648]: I0308 03:12:31.311911 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e81d3c37-e8d7-44c8-973e-13992380ce85-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.312098 master-0 kubenswrapper[7648]: I0308 03:12:31.311969 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e81d3c37-e8d7-44c8-973e-13992380ce85-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " 
pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.312098 master-0 kubenswrapper[7648]: I0308 03:12:31.312060 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e81d3c37-e8d7-44c8-973e-13992380ce85-var-lock\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.312345 master-0 kubenswrapper[7648]: I0308 03:12:31.312182 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e81d3c37-e8d7-44c8-973e-13992380ce85-var-lock\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.312345 master-0 kubenswrapper[7648]: I0308 03:12:31.312239 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e81d3c37-e8d7-44c8-973e-13992380ce85-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.328472 master-0 kubenswrapper[7648]: I0308 03:12:31.328425 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e81d3c37-e8d7-44c8-973e-13992380ce85-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.424954 master-0 kubenswrapper[7648]: I0308 03:12:31.424528 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:12:31.828981 master-0 kubenswrapper[7648]: I0308 03:12:31.827836 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" podStartSLOduration=3.94116799 podStartE2EDuration="11.827818729s" podCreationTimestamp="2026-03-08 03:12:20 +0000 UTC" firstStartedPulling="2026-03-08 03:12:21.004650595 +0000 UTC m=+33.615968885" lastFinishedPulling="2026-03-08 03:12:28.891301344 +0000 UTC m=+41.502619624" observedRunningTime="2026-03-08 03:12:31.251986158 +0000 UTC m=+43.863304448" watchObservedRunningTime="2026-03-08 03:12:31.827818729 +0000 UTC m=+44.439137019" Mar 08 03:12:31.828981 master-0 kubenswrapper[7648]: I0308 03:12:31.828337 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b855446cf-f998w"] Mar 08 03:12:31.828981 master-0 kubenswrapper[7648]: I0308 03:12:31.828523 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" podUID="476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" containerName="controller-manager" containerID="cri-o://55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a" gracePeriod=30 Mar 08 03:12:31.869554 master-0 kubenswrapper[7648]: I0308 03:12:31.865926 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc"] Mar 08 03:12:31.869554 master-0 kubenswrapper[7648]: I0308 03:12:31.866134 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" podUID="0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" containerName="route-controller-manager" containerID="cri-o://e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a" gracePeriod=30 Mar 08 03:12:33.046514 
master-0 kubenswrapper[7648]: I0308 03:12:33.045760 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 08 03:12:33.047249 master-0 kubenswrapper[7648]: I0308 03:12:33.046638 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.051507 master-0 kubenswrapper[7648]: I0308 03:12:33.048376 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-gsmhw" Mar 08 03:12:33.081357 master-0 kubenswrapper[7648]: I0308 03:12:33.081291 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 08 03:12:33.148907 master-0 kubenswrapper[7648]: I0308 03:12:33.148838 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.148907 master-0 kubenswrapper[7648]: I0308 03:12:33.148887 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-var-lock\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.148907 master-0 kubenswrapper[7648]: I0308 03:12:33.148910 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4acb9694-2b7d-4684-907b-e321d71b3f8a-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 
03:12:33.254595 master-0 kubenswrapper[7648]: I0308 03:12:33.249884 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.254595 master-0 kubenswrapper[7648]: I0308 03:12:33.249930 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-var-lock\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.254595 master-0 kubenswrapper[7648]: I0308 03:12:33.249947 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4acb9694-2b7d-4684-907b-e321d71b3f8a-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.254595 master-0 kubenswrapper[7648]: I0308 03:12:33.250194 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-var-lock\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.254595 master-0 kubenswrapper[7648]: I0308 03:12:33.250257 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.270627 master-0 kubenswrapper[7648]: 
I0308 03:12:33.270469 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4acb9694-2b7d-4684-907b-e321d71b3f8a-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.365023 master-0 kubenswrapper[7648]: I0308 03:12:33.364948 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 08 03:12:33.394444 master-0 kubenswrapper[7648]: I0308 03:12:33.394019 7648 patch_prober.go:28] interesting pod/route-controller-manager-5874dc4b9-7nmkc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.40:8443/healthz\": dial tcp 10.128.0.40:8443: connect: connection refused" start-of-body= Mar 08 03:12:33.394444 master-0 kubenswrapper[7648]: I0308 03:12:33.394117 7648 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" podUID="0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.40:8443/healthz\": dial tcp 10.128.0.40:8443: connect: connection refused" Mar 08 03:12:34.495567 master-0 kubenswrapper[7648]: I0308 03:12:34.495530 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_534f9589-9c0c-48aa-a5a8-fddf36d2d562/installer/0.log" Mar 08 03:12:34.495972 master-0 kubenswrapper[7648]: I0308 03:12:34.495953 7648 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:12:34.572085 master-0 kubenswrapper[7648]: I0308 03:12:34.571764 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kube-api-access\") pod \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " Mar 08 03:12:34.572085 master-0 kubenswrapper[7648]: I0308 03:12:34.571853 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kubelet-dir\") pod \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " Mar 08 03:12:34.572085 master-0 kubenswrapper[7648]: I0308 03:12:34.571910 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-var-lock\") pod \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\" (UID: \"534f9589-9c0c-48aa-a5a8-fddf36d2d562\") " Mar 08 03:12:34.572384 master-0 kubenswrapper[7648]: I0308 03:12:34.572245 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-var-lock" (OuterVolumeSpecName: "var-lock") pod "534f9589-9c0c-48aa-a5a8-fddf36d2d562" (UID: "534f9589-9c0c-48aa-a5a8-fddf36d2d562"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:34.572384 master-0 kubenswrapper[7648]: I0308 03:12:34.572290 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "534f9589-9c0c-48aa-a5a8-fddf36d2d562" (UID: "534f9589-9c0c-48aa-a5a8-fddf36d2d562"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:34.583499 master-0 kubenswrapper[7648]: I0308 03:12:34.579730 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "534f9589-9c0c-48aa-a5a8-fddf36d2d562" (UID: "534f9589-9c0c-48aa-a5a8-fddf36d2d562"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:34.673154 master-0 kubenswrapper[7648]: I0308 03:12:34.672919 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:34.673154 master-0 kubenswrapper[7648]: I0308 03:12:34.672954 7648 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:34.673154 master-0 kubenswrapper[7648]: I0308 03:12:34.672964 7648 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/534f9589-9c0c-48aa-a5a8-fddf36d2d562-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:34.856295 master-0 kubenswrapper[7648]: I0308 03:12:34.856111 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:34.905439 master-0 kubenswrapper[7648]: I0308 03:12:34.904353 7648 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:34.976622 master-0 kubenswrapper[7648]: I0308 03:12:34.976416 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-client-ca\") pod \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " Mar 08 03:12:34.976622 master-0 kubenswrapper[7648]: I0308 03:12:34.976553 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmlx5\" (UniqueName: \"kubernetes.io/projected/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-kube-api-access-xmlx5\") pod \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " Mar 08 03:12:34.976622 master-0 kubenswrapper[7648]: I0308 03:12:34.976593 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-serving-cert\") pod \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " Mar 08 03:12:34.976878 master-0 kubenswrapper[7648]: I0308 03:12:34.976629 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-config\") pod \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\" (UID: \"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e\") " Mar 08 03:12:34.977510 master-0 kubenswrapper[7648]: I0308 03:12:34.977443 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-config" (OuterVolumeSpecName: "config") pod "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" (UID: "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:34.977952 master-0 kubenswrapper[7648]: I0308 03:12:34.977918 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-client-ca" (OuterVolumeSpecName: "client-ca") pod "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" (UID: "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:34.983117 master-0 kubenswrapper[7648]: I0308 03:12:34.983075 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" (UID: "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:12:34.993229 master-0 kubenswrapper[7648]: I0308 03:12:34.993122 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-kube-api-access-xmlx5" (OuterVolumeSpecName: "kube-api-access-xmlx5") pod "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" (UID: "0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e"). InnerVolumeSpecName "kube-api-access-xmlx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:35.016247 master-0 kubenswrapper[7648]: I0308 03:12:35.010223 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 08 03:12:35.024795 master-0 kubenswrapper[7648]: I0308 03:12:35.020770 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:12:35.027045 master-0 kubenswrapper[7648]: W0308 03:12:35.026262 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode81d3c37_e8d7_44c8_973e_13992380ce85.slice/crio-33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30 WatchSource:0}: Error finding container 33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30: Status 404 returned error can't find the container with id 33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30 Mar 08 03:12:35.029439 master-0 kubenswrapper[7648]: W0308 03:12:35.028775 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4acb9694_2b7d_4684_907b_e321d71b3f8a.slice/crio-2e9acc31b88d2a78a7d317aebadb3f5a9484ef74f0da66817c67637f29032c81 WatchSource:0}: Error finding container 2e9acc31b88d2a78a7d317aebadb3f5a9484ef74f0da66817c67637f29032c81: Status 404 returned error can't find the container with id 2e9acc31b88d2a78a7d317aebadb3f5a9484ef74f0da66817c67637f29032c81 Mar 08 03:12:35.078342 master-0 kubenswrapper[7648]: I0308 03:12:35.078294 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-client-ca\") pod \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " Mar 08 03:12:35.078504 master-0 kubenswrapper[7648]: I0308 03:12:35.078417 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zbqlj\" (UniqueName: \"kubernetes.io/projected/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-kube-api-access-zbqlj\") pod \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " Mar 08 03:12:35.078504 master-0 kubenswrapper[7648]: I0308 03:12:35.078456 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-proxy-ca-bundles\") pod \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " Mar 08 03:12:35.078573 master-0 kubenswrapper[7648]: I0308 03:12:35.078519 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-serving-cert\") pod \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " Mar 08 03:12:35.078608 master-0 kubenswrapper[7648]: I0308 03:12:35.078590 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-config\") pod \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\" (UID: \"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c\") " Mar 08 03:12:35.078869 master-0 kubenswrapper[7648]: I0308 03:12:35.078843 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.078919 master-0 kubenswrapper[7648]: I0308 03:12:35.078875 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.078919 master-0 kubenswrapper[7648]: I0308 03:12:35.078889 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.078970 master-0 kubenswrapper[7648]: I0308 03:12:35.078931 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmlx5\" (UniqueName: \"kubernetes.io/projected/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e-kube-api-access-xmlx5\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.079413 master-0 kubenswrapper[7648]: I0308 03:12:35.079382 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" (UID: "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:35.079594 master-0 kubenswrapper[7648]: I0308 03:12:35.079566 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-config" (OuterVolumeSpecName: "config") pod "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" (UID: "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:35.079913 master-0 kubenswrapper[7648]: I0308 03:12:35.079851 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-client-ca" (OuterVolumeSpecName: "client-ca") pod "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" (UID: "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:35.088070 master-0 kubenswrapper[7648]: I0308 03:12:35.088006 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" (UID: "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:12:35.088834 master-0 kubenswrapper[7648]: I0308 03:12:35.088797 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-kube-api-access-zbqlj" (OuterVolumeSpecName: "kube-api-access-zbqlj") pod "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" (UID: "476c0b13-b27f-4ccf-bcf4-6a97d7525d1c"). InnerVolumeSpecName "kube-api-access-zbqlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:35.180340 master-0 kubenswrapper[7648]: I0308 03:12:35.180282 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.180572 master-0 kubenswrapper[7648]: I0308 03:12:35.180561 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbqlj\" (UniqueName: \"kubernetes.io/projected/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-kube-api-access-zbqlj\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.180632 master-0 kubenswrapper[7648]: I0308 03:12:35.180622 7648 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.180695 master-0 kubenswrapper[7648]: I0308 03:12:35.180685 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.180754 master-0 kubenswrapper[7648]: I0308 03:12:35.180745 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:35.276392 master-0 kubenswrapper[7648]: I0308 03:12:35.275444 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" event={"ID":"2bbe9b81-0efb-4caa-bacd-55348cd392c6","Type":"ContainerStarted","Data":"67b6371de1e40f11492bdbedad65b4bb4c5dafeb7f94b97c8372fcadf4c1308d"} Mar 08 03:12:35.276392 master-0 kubenswrapper[7648]: I0308 03:12:35.276341 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:12:35.278024 master-0 kubenswrapper[7648]: I0308 03:12:35.277982 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"e81d3c37-e8d7-44c8-973e-13992380ce85","Type":"ContainerStarted","Data":"33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30"} Mar 08 03:12:35.279180 master-0 kubenswrapper[7648]: I0308 03:12:35.279161 7648 generic.go:334] "Generic (PLEG): container finished" podID="0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" containerID="e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a" exitCode=0 Mar 08 03:12:35.279422 master-0 kubenswrapper[7648]: I0308 03:12:35.279312 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" event={"ID":"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e","Type":"ContainerDied","Data":"e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a"} Mar 08 03:12:35.279542 master-0 kubenswrapper[7648]: 
I0308 03:12:35.279528 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" event={"ID":"0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e","Type":"ContainerDied","Data":"054eef96a95fe907231f464fab11b404c512c2aaa4fbea6c8dc2730252ffddc1"} Mar 08 03:12:35.279609 master-0 kubenswrapper[7648]: I0308 03:12:35.279598 7648 scope.go:117] "RemoveContainer" containerID="e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a" Mar 08 03:12:35.279728 master-0 kubenswrapper[7648]: I0308 03:12:35.279441 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc" Mar 08 03:12:35.282411 master-0 kubenswrapper[7648]: I0308 03:12:35.282384 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_534f9589-9c0c-48aa-a5a8-fddf36d2d562/installer/0.log" Mar 08 03:12:35.282495 master-0 kubenswrapper[7648]: I0308 03:12:35.282456 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"534f9589-9c0c-48aa-a5a8-fddf36d2d562","Type":"ContainerDied","Data":"8105f30e57393894ce687c653037e150aaf5e1d89aca93b913e5f23cc6aca59e"} Mar 08 03:12:35.282576 master-0 kubenswrapper[7648]: I0308 03:12:35.282556 7648 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 08 03:12:35.293130 master-0 kubenswrapper[7648]: I0308 03:12:35.293019 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" event={"ID":"4f822854-b9ac-46f2-b03b-e7215fba9208","Type":"ContainerStarted","Data":"e3e7374837cb65747f3de10028260654cf681118c857bee50afe05386bbc4b6b"} Mar 08 03:12:35.293785 master-0 kubenswrapper[7648]: I0308 03:12:35.293739 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:12:35.300343 master-0 kubenswrapper[7648]: I0308 03:12:35.299919 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" event={"ID":"d83aa242-606f-4adc-b689-4aa89625b533","Type":"ContainerStarted","Data":"d6ec2f57434a1d0325f6dee697b145f81fe4a56ad353b6bf09f704b41bdf8e7e"} Mar 08 03:12:35.300343 master-0 kubenswrapper[7648]: I0308 03:12:35.300010 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:12:35.301373 master-0 kubenswrapper[7648]: I0308 03:12:35.301340 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:12:35.303673 master-0 kubenswrapper[7648]: I0308 03:12:35.303636 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"4acb9694-2b7d-4684-907b-e321d71b3f8a","Type":"ContainerStarted","Data":"2e9acc31b88d2a78a7d317aebadb3f5a9484ef74f0da66817c67637f29032c81"} Mar 08 03:12:35.303900 master-0 kubenswrapper[7648]: I0308 03:12:35.303876 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:12:35.310690 master-0 kubenswrapper[7648]: I0308 03:12:35.310658 7648 generic.go:334] "Generic (PLEG): container finished" podID="476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" containerID="55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a" exitCode=0 Mar 08 03:12:35.310799 master-0 kubenswrapper[7648]: I0308 03:12:35.310781 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" event={"ID":"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c","Type":"ContainerDied","Data":"55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a"} Mar 08 03:12:35.310872 master-0 kubenswrapper[7648]: I0308 03:12:35.310861 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" event={"ID":"476c0b13-b27f-4ccf-bcf4-6a97d7525d1c","Type":"ContainerDied","Data":"ad8221884944f8d23c4ee1643b7a86a37022bffdd15185cbce466ba1d7ac80d8"} Mar 08 03:12:35.311022 master-0 kubenswrapper[7648]: I0308 03:12:35.310993 7648 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b855446cf-f998w" Mar 08 03:12:35.313953 master-0 kubenswrapper[7648]: I0308 03:12:35.312934 7648 scope.go:117] "RemoveContainer" containerID="e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a" Mar 08 03:12:35.313953 master-0 kubenswrapper[7648]: E0308 03:12:35.313253 7648 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a\": container with ID starting with e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a not found: ID does not exist" containerID="e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a" Mar 08 03:12:35.313953 master-0 kubenswrapper[7648]: I0308 03:12:35.313282 7648 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a"} err="failed to get container status \"e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a\": rpc error: code = NotFound desc = could not find container \"e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a\": container with ID starting with e5d3858e866a854cb9e156ac54a5a898be0f07dcba2c4116aa4a338ffebb859a not found: ID does not exist" Mar 08 03:12:35.313953 master-0 kubenswrapper[7648]: I0308 03:12:35.313307 7648 scope.go:117] "RemoveContainer" containerID="d5b621a3334abe2132d33e21101dd76b1c492821c93abc7d97ebb298b5cc4bfc" Mar 08 03:12:35.329184 master-0 kubenswrapper[7648]: I0308 03:12:35.329121 7648 scope.go:117] "RemoveContainer" containerID="55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a" Mar 08 03:12:35.336477 master-0 kubenswrapper[7648]: I0308 03:12:35.334250 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc"] Mar 08 03:12:35.341354 master-0 
kubenswrapper[7648]: I0308 03:12:35.341292 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5874dc4b9-7nmkc"] Mar 08 03:12:35.354761 master-0 kubenswrapper[7648]: I0308 03:12:35.354722 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:12:35.356795 master-0 kubenswrapper[7648]: I0308 03:12:35.356750 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 08 03:12:35.359542 master-0 kubenswrapper[7648]: I0308 03:12:35.359515 7648 scope.go:117] "RemoveContainer" containerID="55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a" Mar 08 03:12:35.362000 master-0 kubenswrapper[7648]: E0308 03:12:35.361961 7648 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a\": container with ID starting with 55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a not found: ID does not exist" containerID="55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a" Mar 08 03:12:35.362079 master-0 kubenswrapper[7648]: I0308 03:12:35.361998 7648 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a"} err="failed to get container status \"55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a\": rpc error: code = NotFound desc = could not find container \"55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a\": container with ID starting with 55b7b6908eb460211048407de80632c2d47893467317492f8e2d166a816c4c6a not found: ID does not exist" Mar 08 03:12:35.461198 master-0 kubenswrapper[7648]: I0308 03:12:35.460573 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-5b855446cf-f998w"] Mar 08 03:12:35.467147 master-0 kubenswrapper[7648]: I0308 03:12:35.467019 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b855446cf-f998w"] Mar 08 03:12:35.574665 master-0 kubenswrapper[7648]: I0308 03:12:35.574502 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:12:35.575390 master-0 kubenswrapper[7648]: I0308 03:12:35.574853 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:12:35.583137 master-0 kubenswrapper[7648]: I0308 03:12:35.583092 7648 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:12:35.619834 master-0 kubenswrapper[7648]: I0308 03:12:35.619769 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" path="/var/lib/kubelet/pods/0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e/volumes" Mar 08 03:12:35.620931 master-0 kubenswrapper[7648]: I0308 03:12:35.620879 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" path="/var/lib/kubelet/pods/476c0b13-b27f-4ccf-bcf4-6a97d7525d1c/volumes" Mar 08 03:12:35.621827 master-0 kubenswrapper[7648]: I0308 03:12:35.621780 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534f9589-9c0c-48aa-a5a8-fddf36d2d562" path="/var/lib/kubelet/pods/534f9589-9c0c-48aa-a5a8-fddf36d2d562/volumes" Mar 08 03:12:35.923569 master-0 kubenswrapper[7648]: I0308 03:12:35.923525 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86z4t"] Mar 08 03:12:35.923759 master-0 kubenswrapper[7648]: E0308 03:12:35.923705 7648 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="534f9589-9c0c-48aa-a5a8-fddf36d2d562" containerName="installer" Mar 08 03:12:35.923759 master-0 kubenswrapper[7648]: I0308 03:12:35.923718 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="534f9589-9c0c-48aa-a5a8-fddf36d2d562" containerName="installer" Mar 08 03:12:35.923759 master-0 kubenswrapper[7648]: E0308 03:12:35.923730 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" containerName="route-controller-manager" Mar 08 03:12:35.923759 master-0 kubenswrapper[7648]: I0308 03:12:35.923736 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" containerName="route-controller-manager" Mar 08 03:12:35.923759 master-0 kubenswrapper[7648]: E0308 03:12:35.923748 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" containerName="controller-manager" Mar 08 03:12:35.923759 master-0 kubenswrapper[7648]: I0308 03:12:35.923754 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" containerName="controller-manager" Mar 08 03:12:35.924008 master-0 kubenswrapper[7648]: I0308 03:12:35.923826 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="476c0b13-b27f-4ccf-bcf4-6a97d7525d1c" containerName="controller-manager" Mar 08 03:12:35.924008 master-0 kubenswrapper[7648]: I0308 03:12:35.923837 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea3d1ee-7c91-4b30-9813-8ae1d1afde7e" containerName="route-controller-manager" Mar 08 03:12:35.924008 master-0 kubenswrapper[7648]: I0308 03:12:35.923845 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="534f9589-9c0c-48aa-a5a8-fddf36d2d562" containerName="installer" Mar 08 03:12:35.924395 master-0 kubenswrapper[7648]: I0308 03:12:35.924372 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:35.937579 master-0 kubenswrapper[7648]: I0308 03:12:35.937534 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86z4t"] Mar 08 03:12:36.094113 master-0 kubenswrapper[7648]: I0308 03:12:36.094057 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.094303 master-0 kubenswrapper[7648]: I0308 03:12:36.094214 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.094405 master-0 kubenswrapper[7648]: I0308 03:12:36.094376 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgw92\" (UniqueName: \"kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.115474 master-0 kubenswrapper[7648]: I0308 03:12:36.115372 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r494d"] Mar 08 03:12:36.116580 master-0 kubenswrapper[7648]: I0308 03:12:36.116550 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.127710 master-0 kubenswrapper[7648]: I0308 03:12:36.127671 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r494d"] Mar 08 03:12:36.195549 master-0 kubenswrapper[7648]: I0308 03:12:36.195415 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgw92\" (UniqueName: \"kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.195806 master-0 kubenswrapper[7648]: I0308 03:12:36.195787 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.195903 master-0 kubenswrapper[7648]: I0308 03:12:36.195889 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.196662 master-0 kubenswrapper[7648]: I0308 03:12:36.196610 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.196711 master-0 kubenswrapper[7648]: I0308 03:12:36.196632 7648 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.220274 master-0 kubenswrapper[7648]: I0308 03:12:36.220209 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgw92\" (UniqueName: \"kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.248887 master-0 kubenswrapper[7648]: I0308 03:12:36.248834 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:12:36.312669 master-0 kubenswrapper[7648]: I0308 03:12:36.296825 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxbs\" (UniqueName: \"kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.312669 master-0 kubenswrapper[7648]: I0308 03:12:36.296901 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.312669 master-0 kubenswrapper[7648]: I0308 03:12:36.296974 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.330717 master-0 kubenswrapper[7648]: I0308 03:12:36.326590 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"4acb9694-2b7d-4684-907b-e321d71b3f8a","Type":"ContainerStarted","Data":"ecdc56faed81edd2f802f65988bb413c62d0d2476736fb84c13832adfb3985e8"} Mar 08 03:12:36.330717 master-0 kubenswrapper[7648]: I0308 03:12:36.330401 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"e81d3c37-e8d7-44c8-973e-13992380ce85","Type":"ContainerStarted","Data":"d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e"} Mar 08 03:12:36.349311 master-0 kubenswrapper[7648]: I0308 03:12:36.349121 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:12:36.356394 master-0 kubenswrapper[7648]: I0308 03:12:36.356333 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=3.356312972 podStartE2EDuration="3.356312972s" podCreationTimestamp="2026-03-08 03:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:36.353623159 +0000 UTC m=+48.964941489" watchObservedRunningTime="2026-03-08 03:12:36.356312972 +0000 UTC m=+48.967631272" Mar 08 03:12:36.381294 master-0 kubenswrapper[7648]: I0308 03:12:36.381217 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=5.38119839 podStartE2EDuration="5.38119839s" podCreationTimestamp="2026-03-08 
03:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:36.377945578 +0000 UTC m=+48.989263878" watchObservedRunningTime="2026-03-08 03:12:36.38119839 +0000 UTC m=+48.992516690" Mar 08 03:12:36.398144 master-0 kubenswrapper[7648]: I0308 03:12:36.398096 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxbs\" (UniqueName: \"kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.398144 master-0 kubenswrapper[7648]: I0308 03:12:36.398140 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.398324 master-0 kubenswrapper[7648]: I0308 03:12:36.398176 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.398632 master-0 kubenswrapper[7648]: I0308 03:12:36.398612 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.398998 master-0 kubenswrapper[7648]: I0308 03:12:36.398955 7648 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.416678 master-0 kubenswrapper[7648]: I0308 03:12:36.416596 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxbs\" (UniqueName: \"kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.434036 master-0 kubenswrapper[7648]: I0308 03:12:36.433998 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:12:36.715344 master-0 kubenswrapper[7648]: I0308 03:12:36.715272 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86z4t"] Mar 08 03:12:36.725722 master-0 kubenswrapper[7648]: W0308 03:12:36.725668 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e324f6c_ee4c_42bc_b241_9c6938749854.slice/crio-9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f WatchSource:0}: Error finding container 9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f: Status 404 returned error can't find the container with id 9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f Mar 08 03:12:36.758890 master-0 kubenswrapper[7648]: I0308 03:12:36.755746 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg"] Mar 08 03:12:36.758890 master-0 kubenswrapper[7648]: I0308 03:12:36.756338 7648 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-55644b7446-5ckmr"] Mar 08 03:12:36.758890 master-0 kubenswrapper[7648]: I0308 03:12:36.756798 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.758890 master-0 kubenswrapper[7648]: I0308 03:12:36.757234 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.762557 master-0 kubenswrapper[7648]: I0308 03:12:36.761510 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:12:36.762557 master-0 kubenswrapper[7648]: I0308 03:12:36.761864 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:12:36.762557 master-0 kubenswrapper[7648]: I0308 03:12:36.762096 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 03:12:36.762557 master-0 kubenswrapper[7648]: I0308 03:12:36.762227 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:12:36.762557 master-0 kubenswrapper[7648]: I0308 03:12:36.762386 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 03:12:36.763018 master-0 kubenswrapper[7648]: I0308 03:12:36.762862 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:12:36.763018 master-0 kubenswrapper[7648]: I0308 03:12:36.762932 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l8646" Mar 08 03:12:36.764602 master-0 kubenswrapper[7648]: I0308 03:12:36.763153 7648 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-xfphx" Mar 08 03:12:36.769129 master-0 kubenswrapper[7648]: I0308 03:12:36.765695 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:12:36.769129 master-0 kubenswrapper[7648]: I0308 03:12:36.765780 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:12:36.769129 master-0 kubenswrapper[7648]: I0308 03:12:36.765876 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:12:36.769129 master-0 kubenswrapper[7648]: I0308 03:12:36.766031 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:12:36.779042 master-0 kubenswrapper[7648]: I0308 03:12:36.776017 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55644b7446-5ckmr"] Mar 08 03:12:36.779216 master-0 kubenswrapper[7648]: I0308 03:12:36.779064 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg"] Mar 08 03:12:36.793656 master-0 kubenswrapper[7648]: I0308 03:12:36.793611 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:12:36.820528 master-0 kubenswrapper[7648]: I0308 03:12:36.820467 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-serving-cert\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " 
pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820535 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-config\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820563 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-client-ca\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820590 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-config\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820607 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-proxy-ca-bundles\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820625 7648 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-client-ca\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820647 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msq9j\" (UniqueName: \"kubernetes.io/projected/49384416-21b1-4d87-9ab4-77f0efbb9ff8-kube-api-access-msq9j\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820667 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ml8w\" (UniqueName: \"kubernetes.io/projected/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-kube-api-access-7ml8w\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.820683 master-0 kubenswrapper[7648]: I0308 03:12:36.820684 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49384416-21b1-4d87-9ab4-77f0efbb9ff8-serving-cert\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.900272 master-0 kubenswrapper[7648]: I0308 03:12:36.900196 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r494d"] Mar 08 03:12:36.921630 master-0 
kubenswrapper[7648]: I0308 03:12:36.921586 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-config\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.921706 master-0 kubenswrapper[7648]: I0308 03:12:36.921642 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-proxy-ca-bundles\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.921706 master-0 kubenswrapper[7648]: I0308 03:12:36.921675 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-client-ca\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.921764 master-0 kubenswrapper[7648]: I0308 03:12:36.921712 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msq9j\" (UniqueName: \"kubernetes.io/projected/49384416-21b1-4d87-9ab4-77f0efbb9ff8-kube-api-access-msq9j\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.921764 master-0 kubenswrapper[7648]: I0308 03:12:36.921747 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ml8w\" (UniqueName: 
\"kubernetes.io/projected/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-kube-api-access-7ml8w\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.921819 master-0 kubenswrapper[7648]: I0308 03:12:36.921778 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49384416-21b1-4d87-9ab4-77f0efbb9ff8-serving-cert\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.921848 master-0 kubenswrapper[7648]: I0308 03:12:36.921819 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-serving-cert\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.921879 master-0 kubenswrapper[7648]: I0308 03:12:36.921865 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-config\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.921929 master-0 kubenswrapper[7648]: I0308 03:12:36.921906 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-client-ca\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.922990 master-0 
kubenswrapper[7648]: I0308 03:12:36.922958 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-client-ca\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.924277 master-0 kubenswrapper[7648]: I0308 03:12:36.924245 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-config\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.926068 master-0 kubenswrapper[7648]: I0308 03:12:36.925356 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-proxy-ca-bundles\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.926302 master-0 kubenswrapper[7648]: I0308 03:12:36.926279 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-client-ca\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.926461 master-0 kubenswrapper[7648]: I0308 03:12:36.926429 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49384416-21b1-4d87-9ab4-77f0efbb9ff8-serving-cert\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: 
\"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.927191 master-0 kubenswrapper[7648]: I0308 03:12:36.927162 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-serving-cert\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.939040 master-0 kubenswrapper[7648]: I0308 03:12:36.938973 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-config\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:36.949667 master-0 kubenswrapper[7648]: I0308 03:12:36.949362 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msq9j\" (UniqueName: \"kubernetes.io/projected/49384416-21b1-4d87-9ab4-77f0efbb9ff8-kube-api-access-msq9j\") pod \"route-controller-manager-5696bdc5b4-tqtbg\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:36.949874 master-0 kubenswrapper[7648]: I0308 03:12:36.949800 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ml8w\" (UniqueName: \"kubernetes.io/projected/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-kube-api-access-7ml8w\") pod \"controller-manager-55644b7446-5ckmr\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:37.022985 master-0 kubenswrapper[7648]: I0308 03:12:37.022853 7648 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz"] Mar 08 03:12:37.023795 master-0 kubenswrapper[7648]: I0308 03:12:37.023769 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.028594 master-0 kubenswrapper[7648]: I0308 03:12:37.028546 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 03:12:37.028754 master-0 kubenswrapper[7648]: I0308 03:12:37.028700 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-chsmd" Mar 08 03:12:37.040768 master-0 kubenswrapper[7648]: I0308 03:12:37.040725 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz"] Mar 08 03:12:37.125022 master-0 kubenswrapper[7648]: I0308 03:12:37.124955 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-982r4\" (UniqueName: \"kubernetes.io/projected/febf6a91-8b78-4b22-93b9-155cb7761fc4-kube-api-access-982r4\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.125207 master-0 kubenswrapper[7648]: I0308 03:12:37.125037 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/febf6a91-8b78-4b22-93b9-155cb7761fc4-tmpfs\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.125207 master-0 kubenswrapper[7648]: I0308 03:12:37.125063 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-webhook-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.125207 master-0 kubenswrapper[7648]: I0308 03:12:37.125136 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-apiservice-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.172172 master-0 kubenswrapper[7648]: I0308 03:12:37.172078 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:37.190523 master-0 kubenswrapper[7648]: I0308 03:12:37.189350 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:37.227290 master-0 kubenswrapper[7648]: I0308 03:12:37.225876 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-apiservice-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.227290 master-0 kubenswrapper[7648]: I0308 03:12:37.225936 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-982r4\" (UniqueName: \"kubernetes.io/projected/febf6a91-8b78-4b22-93b9-155cb7761fc4-kube-api-access-982r4\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.227290 master-0 kubenswrapper[7648]: I0308 03:12:37.225976 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/febf6a91-8b78-4b22-93b9-155cb7761fc4-tmpfs\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.227290 master-0 kubenswrapper[7648]: I0308 03:12:37.226000 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-webhook-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.227290 master-0 kubenswrapper[7648]: I0308 03:12:37.227244 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/febf6a91-8b78-4b22-93b9-155cb7761fc4-tmpfs\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.230797 master-0 kubenswrapper[7648]: I0308 03:12:37.230551 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-apiservice-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.230797 master-0 kubenswrapper[7648]: I0308 03:12:37.230631 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-webhook-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.244010 master-0 kubenswrapper[7648]: I0308 03:12:37.243960 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-982r4\" (UniqueName: \"kubernetes.io/projected/febf6a91-8b78-4b22-93b9-155cb7761fc4-kube-api-access-982r4\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.344773 master-0 kubenswrapper[7648]: I0308 03:12:37.344131 7648 generic.go:334] "Generic (PLEG): container finished" podID="7e324f6c-ee4c-42bc-b241-9c6938749854" containerID="f23bd786497d6c307edb85e8d774c9b8f2223af0ca9dc43c45c0639c00c00251" exitCode=0 Mar 08 03:12:37.344773 master-0 kubenswrapper[7648]: I0308 03:12:37.344218 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86z4t" 
event={"ID":"7e324f6c-ee4c-42bc-b241-9c6938749854","Type":"ContainerDied","Data":"f23bd786497d6c307edb85e8d774c9b8f2223af0ca9dc43c45c0639c00c00251"} Mar 08 03:12:37.344773 master-0 kubenswrapper[7648]: I0308 03:12:37.344305 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86z4t" event={"ID":"7e324f6c-ee4c-42bc-b241-9c6938749854","Type":"ContainerStarted","Data":"9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f"} Mar 08 03:12:37.351790 master-0 kubenswrapper[7648]: I0308 03:12:37.348581 7648 generic.go:334] "Generic (PLEG): container finished" podID="b05d5093-20f4-42d5-9db3-811e049cc1b6" containerID="9dcf81635de6906146c147e78ec6bda20f98dd55e53a8e7eb4bd3270e962f41e" exitCode=0 Mar 08 03:12:37.351790 master-0 kubenswrapper[7648]: I0308 03:12:37.349444 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r494d" event={"ID":"b05d5093-20f4-42d5-9db3-811e049cc1b6","Type":"ContainerDied","Data":"9dcf81635de6906146c147e78ec6bda20f98dd55e53a8e7eb4bd3270e962f41e"} Mar 08 03:12:37.351790 master-0 kubenswrapper[7648]: I0308 03:12:37.349534 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r494d" event={"ID":"b05d5093-20f4-42d5-9db3-811e049cc1b6","Type":"ContainerStarted","Data":"702795b7a3b9492f17a3552f3377a1320bf2ba8da965c8533a8f5f8dc47e6545"} Mar 08 03:12:37.371824 master-0 kubenswrapper[7648]: I0308 03:12:37.369859 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:37.522653 master-0 kubenswrapper[7648]: I0308 03:12:37.522208 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nggbb"] Mar 08 03:12:37.523094 master-0 kubenswrapper[7648]: I0308 03:12:37.523077 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.529822 master-0 kubenswrapper[7648]: I0308 03:12:37.528300 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nggbb"] Mar 08 03:12:37.529822 master-0 kubenswrapper[7648]: I0308 03:12:37.529723 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.529822 master-0 kubenswrapper[7648]: I0308 03:12:37.529762 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8dt\" (UniqueName: \"kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.529822 master-0 kubenswrapper[7648]: I0308 03:12:37.529796 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.589817 master-0 kubenswrapper[7648]: I0308 03:12:37.588809 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55644b7446-5ckmr"] Mar 08 03:12:37.597240 master-0 kubenswrapper[7648]: W0308 03:12:37.595132 7648 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64620a50_b5de_4b7f_84a3_a2df9d7da9fe.slice/crio-f96c717487c8fec762c20667ad24a9394c9f9417d53c2f93b4dfcdc28227e714 WatchSource:0}: Error finding container f96c717487c8fec762c20667ad24a9394c9f9417d53c2f93b4dfcdc28227e714: Status 404 returned error can't find the container with id f96c717487c8fec762c20667ad24a9394c9f9417d53c2f93b4dfcdc28227e714 Mar 08 03:12:37.640048 master-0 kubenswrapper[7648]: I0308 03:12:37.637107 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.645119 master-0 kubenswrapper[7648]: I0308 03:12:37.640974 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8dt\" (UniqueName: \"kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.645119 master-0 kubenswrapper[7648]: I0308 03:12:37.641055 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.645119 master-0 kubenswrapper[7648]: I0308 03:12:37.644897 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " 
pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.645119 master-0 kubenswrapper[7648]: I0308 03:12:37.645067 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.661812 master-0 kubenswrapper[7648]: I0308 03:12:37.661694 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg"] Mar 08 03:12:37.661812 master-0 kubenswrapper[7648]: I0308 03:12:37.661724 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8dt\" (UniqueName: \"kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.674204 master-0 kubenswrapper[7648]: W0308 03:12:37.674177 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49384416_21b1_4d87_9ab4_77f0efbb9ff8.slice/crio-0c5d0055fe86ac5ea9ab079fff2e0abe8e3b575553100200612d0622ec310e85 WatchSource:0}: Error finding container 0c5d0055fe86ac5ea9ab079fff2e0abe8e3b575553100200612d0622ec310e85: Status 404 returned error can't find the container with id 0c5d0055fe86ac5ea9ab079fff2e0abe8e3b575553100200612d0622ec310e85 Mar 08 03:12:37.835414 master-0 kubenswrapper[7648]: I0308 03:12:37.835379 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz"] Mar 08 03:12:37.845207 master-0 kubenswrapper[7648]: I0308 03:12:37.845149 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:12:37.847787 master-0 kubenswrapper[7648]: W0308 03:12:37.847754 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfebf6a91_8b78_4b22_93b9_155cb7761fc4.slice/crio-9262fbf1bea820c1d6587d88007865b18e2191381d424746083eaa6434ea0fcd WatchSource:0}: Error finding container 9262fbf1bea820c1d6587d88007865b18e2191381d424746083eaa6434ea0fcd: Status 404 returned error can't find the container with id 9262fbf1bea820c1d6587d88007865b18e2191381d424746083eaa6434ea0fcd Mar 08 03:12:38.078938 master-0 kubenswrapper[7648]: I0308 03:12:38.078138 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nggbb"] Mar 08 03:12:38.089866 master-0 kubenswrapper[7648]: W0308 03:12:38.089653 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a6e3f01_0f22_4961_b450_56aca5477943.slice/crio-774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed WatchSource:0}: Error finding container 774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed: Status 404 returned error can't find the container with id 774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed Mar 08 03:12:38.356716 master-0 kubenswrapper[7648]: I0308 03:12:38.354678 7648 generic.go:334] "Generic (PLEG): container finished" podID="1a6e3f01-0f22-4961-b450-56aca5477943" containerID="efa860cee031eccbf226be40d18bc7a86bd5de050910722eed848d08f9d751a8" exitCode=0 Mar 08 03:12:38.356716 master-0 kubenswrapper[7648]: I0308 03:12:38.356648 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nggbb" event={"ID":"1a6e3f01-0f22-4961-b450-56aca5477943","Type":"ContainerDied","Data":"efa860cee031eccbf226be40d18bc7a86bd5de050910722eed848d08f9d751a8"} Mar 08 03:12:38.356716 master-0 
kubenswrapper[7648]: I0308 03:12:38.356677 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nggbb" event={"ID":"1a6e3f01-0f22-4961-b450-56aca5477943","Type":"ContainerStarted","Data":"774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed"} Mar 08 03:12:38.359070 master-0 kubenswrapper[7648]: I0308 03:12:38.358701 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" event={"ID":"64620a50-b5de-4b7f-84a3-a2df9d7da9fe","Type":"ContainerStarted","Data":"44c51ef09a1e4d640e56c548d7e597beadc76bae6917d92d87ab66115296db6e"} Mar 08 03:12:38.359070 master-0 kubenswrapper[7648]: I0308 03:12:38.358770 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" event={"ID":"64620a50-b5de-4b7f-84a3-a2df9d7da9fe","Type":"ContainerStarted","Data":"f96c717487c8fec762c20667ad24a9394c9f9417d53c2f93b4dfcdc28227e714"} Mar 08 03:12:38.361191 master-0 kubenswrapper[7648]: I0308 03:12:38.361134 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:38.362683 master-0 kubenswrapper[7648]: I0308 03:12:38.362627 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" event={"ID":"49384416-21b1-4d87-9ab4-77f0efbb9ff8","Type":"ContainerStarted","Data":"29ec3831341bb36e2171a6456449c15abed4681f1b44070d0757301e9477938f"} Mar 08 03:12:38.362683 master-0 kubenswrapper[7648]: I0308 03:12:38.362663 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" event={"ID":"49384416-21b1-4d87-9ab4-77f0efbb9ff8","Type":"ContainerStarted","Data":"0c5d0055fe86ac5ea9ab079fff2e0abe8e3b575553100200612d0622ec310e85"} Mar 08 03:12:38.366350 master-0 
kubenswrapper[7648]: I0308 03:12:38.364096 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:38.366350 master-0 kubenswrapper[7648]: I0308 03:12:38.364429 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:38.367302 master-0 kubenswrapper[7648]: I0308 03:12:38.366540 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" event={"ID":"febf6a91-8b78-4b22-93b9-155cb7761fc4","Type":"ContainerStarted","Data":"cc81df0ba4ab9eeb86b1b242fe232cbbf2baf35b697815e632689fd0077b5f0f"} Mar 08 03:12:38.367302 master-0 kubenswrapper[7648]: I0308 03:12:38.366563 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:38.367302 master-0 kubenswrapper[7648]: I0308 03:12:38.366572 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" event={"ID":"febf6a91-8b78-4b22-93b9-155cb7761fc4","Type":"ContainerStarted","Data":"9262fbf1bea820c1d6587d88007865b18e2191381d424746083eaa6434ea0fcd"} Mar 08 03:12:38.367720 master-0 kubenswrapper[7648]: I0308 03:12:38.367578 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:38.376291 master-0 kubenswrapper[7648]: E0308 03:12:38.376200 7648 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a6e3f01_0f22_4961_b450_56aca5477943.slice/crio-conmon-efa860cee031eccbf226be40d18bc7a86bd5de050910722eed848d08f9d751a8.scope\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a6e3f01_0f22_4961_b450_56aca5477943.slice/crio-efa860cee031eccbf226be40d18bc7a86bd5de050910722eed848d08f9d751a8.scope\": RecentStats: unable to find data in memory cache]" Mar 08 03:12:38.433691 master-0 kubenswrapper[7648]: I0308 03:12:38.433326 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" podStartSLOduration=7.433303025 podStartE2EDuration="7.433303025s" podCreationTimestamp="2026-03-08 03:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:38.397636645 +0000 UTC m=+51.008954935" watchObservedRunningTime="2026-03-08 03:12:38.433303025 +0000 UTC m=+51.044621315" Mar 08 03:12:38.436133 master-0 kubenswrapper[7648]: I0308 03:12:38.436098 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" podStartSLOduration=2.436088302 podStartE2EDuration="2.436088302s" podCreationTimestamp="2026-03-08 03:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:38.425972682 +0000 UTC m=+51.037290972" watchObservedRunningTime="2026-03-08 03:12:38.436088302 +0000 UTC m=+51.047406592" Mar 08 03:12:38.720838 master-0 kubenswrapper[7648]: I0308 03:12:38.720602 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" podStartSLOduration=7.7205847389999995 podStartE2EDuration="7.720584739s" podCreationTimestamp="2026-03-08 03:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:38.469714332 +0000 UTC 
m=+51.081032612" watchObservedRunningTime="2026-03-08 03:12:38.720584739 +0000 UTC m=+51.331903029" Mar 08 03:12:38.720838 master-0 kubenswrapper[7648]: I0308 03:12:38.720825 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zm8fd"] Mar 08 03:12:38.722330 master-0 kubenswrapper[7648]: I0308 03:12:38.722308 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:12:38.724385 master-0 kubenswrapper[7648]: I0308 03:12:38.724309 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-4dw5m" Mar 08 03:12:38.739544 master-0 kubenswrapper[7648]: I0308 03:12:38.739467 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm8fd"] Mar 08 03:12:38.781979 master-0 kubenswrapper[7648]: I0308 03:12:38.781659 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:12:38.867414 master-0 kubenswrapper[7648]: I0308 03:12:38.867334 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-catalog-content\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:12:38.867933 master-0 kubenswrapper[7648]: I0308 03:12:38.867455 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vvq\" (UniqueName: \"kubernetes.io/projected/6a9d0240-fc00-4d78-9458-8f53b1876f1b-kube-api-access-b2vvq\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:12:38.867933 master-0 kubenswrapper[7648]: 
I0308 03:12:38.867552 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-utilities\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:38.969374 master-0 kubenswrapper[7648]: I0308 03:12:38.969240 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-catalog-content\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:38.969374 master-0 kubenswrapper[7648]: I0308 03:12:38.969310 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vvq\" (UniqueName: \"kubernetes.io/projected/6a9d0240-fc00-4d78-9458-8f53b1876f1b-kube-api-access-b2vvq\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:38.969374 master-0 kubenswrapper[7648]: I0308 03:12:38.969334 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-utilities\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:38.971626 master-0 kubenswrapper[7648]: I0308 03:12:38.969901 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-utilities\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:38.971626 master-0 kubenswrapper[7648]: I0308 03:12:38.970180 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-catalog-content\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:39.158800 master-0 kubenswrapper[7648]: I0308 03:12:39.158750 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vvq\" (UniqueName: \"kubernetes.io/projected/6a9d0240-fc00-4d78-9458-8f53b1876f1b-kube-api-access-b2vvq\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:39.372403 master-0 kubenswrapper[7648]: I0308 03:12:39.372344 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:12:41.309886 master-0 kubenswrapper[7648]: I0308 03:12:41.309798 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:12:41.311232 master-0 kubenswrapper[7648]: I0308 03:12:41.310114 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="4acb9694-2b7d-4684-907b-e321d71b3f8a" containerName="installer" containerID="cri-o://ecdc56faed81edd2f802f65988bb413c62d0d2476736fb84c13832adfb3985e8" gracePeriod=30
Mar 08 03:12:41.317992 master-0 kubenswrapper[7648]: I0308 03:12:41.315964 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm8fd"]
Mar 08 03:12:41.401903 master-0 kubenswrapper[7648]: I0308 03:12:41.401798 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm8fd" event={"ID":"6a9d0240-fc00-4d78-9458-8f53b1876f1b","Type":"ContainerStarted","Data":"cb298ff85bc6afefe78d9670cec4232d77064bf8eb867d648f99dcfde97ded03"}
Mar 08 03:12:41.537554 master-0 kubenswrapper[7648]: I0308 03:12:41.537417 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86z4t"]
Mar 08 03:12:42.202655 master-0 kubenswrapper[7648]: I0308 03:12:42.199086 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zm92r"]
Mar 08 03:12:42.202655 master-0 kubenswrapper[7648]: I0308 03:12:42.200364 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.202655 master-0 kubenswrapper[7648]: I0308 03:12:42.202387 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-lwkgm"
Mar 08 03:12:42.386914 master-0 kubenswrapper[7648]: I0308 03:12:42.386775 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58bm\" (UniqueName: \"kubernetes.io/projected/68309159-130a-4ffa-acec-95dc4b795b8f-kube-api-access-k58bm\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.386914 master-0 kubenswrapper[7648]: I0308 03:12:42.386901 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-utilities\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.387865 master-0 kubenswrapper[7648]: I0308 03:12:42.386936 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-catalog-content\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.421970 master-0 kubenswrapper[7648]: I0308 03:12:42.421832 7648 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0240-fc00-4d78-9458-8f53b1876f1b" containerID="f9107d4fe9e5fd8ff1dc1c072f33a8b790e39886ea3bd32d4664e530799cf713" exitCode=0
Mar 08 03:12:42.421970 master-0 kubenswrapper[7648]: I0308 03:12:42.421891 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm8fd" event={"ID":"6a9d0240-fc00-4d78-9458-8f53b1876f1b","Type":"ContainerDied","Data":"f9107d4fe9e5fd8ff1dc1c072f33a8b790e39886ea3bd32d4664e530799cf713"}
Mar 08 03:12:42.426101 master-0 kubenswrapper[7648]: I0308 03:12:42.425600 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_4acb9694-2b7d-4684-907b-e321d71b3f8a/installer/0.log"
Mar 08 03:12:42.426101 master-0 kubenswrapper[7648]: I0308 03:12:42.425654 7648 generic.go:334] "Generic (PLEG): container finished" podID="4acb9694-2b7d-4684-907b-e321d71b3f8a" containerID="ecdc56faed81edd2f802f65988bb413c62d0d2476736fb84c13832adfb3985e8" exitCode=1
Mar 08 03:12:42.426101 master-0 kubenswrapper[7648]: I0308 03:12:42.425689 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"4acb9694-2b7d-4684-907b-e321d71b3f8a","Type":"ContainerDied","Data":"ecdc56faed81edd2f802f65988bb413c62d0d2476736fb84c13832adfb3985e8"}
Mar 08 03:12:42.488858 master-0 kubenswrapper[7648]: I0308 03:12:42.488806 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-utilities\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.488994 master-0 kubenswrapper[7648]: I0308 03:12:42.488871 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-catalog-content\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.488994 master-0 kubenswrapper[7648]: I0308 03:12:42.488906 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58bm\" (UniqueName: \"kubernetes.io/projected/68309159-130a-4ffa-acec-95dc4b795b8f-kube-api-access-k58bm\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.490308 master-0 kubenswrapper[7648]: I0308 03:12:42.489984 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-utilities\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.490308 master-0 kubenswrapper[7648]: I0308 03:12:42.490263 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-catalog-content\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:42.561733 master-0 kubenswrapper[7648]: I0308 03:12:42.561685 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_4acb9694-2b7d-4684-907b-e321d71b3f8a/installer/0.log"
Mar 08 03:12:42.561916 master-0 kubenswrapper[7648]: I0308 03:12:42.561763 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:12:42.632944 master-0 kubenswrapper[7648]: I0308 03:12:42.632161 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zm92r"]
Mar 08 03:12:42.692978 master-0 kubenswrapper[7648]: I0308 03:12:42.692924 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4acb9694-2b7d-4684-907b-e321d71b3f8a-kube-api-access\") pod \"4acb9694-2b7d-4684-907b-e321d71b3f8a\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") "
Mar 08 03:12:42.693189 master-0 kubenswrapper[7648]: I0308 03:12:42.693023 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-var-lock\") pod \"4acb9694-2b7d-4684-907b-e321d71b3f8a\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") "
Mar 08 03:12:42.693189 master-0 kubenswrapper[7648]: I0308 03:12:42.693088 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-kubelet-dir\") pod \"4acb9694-2b7d-4684-907b-e321d71b3f8a\" (UID: \"4acb9694-2b7d-4684-907b-e321d71b3f8a\") "
Mar 08 03:12:42.693398 master-0 kubenswrapper[7648]: I0308 03:12:42.693372 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4acb9694-2b7d-4684-907b-e321d71b3f8a" (UID: "4acb9694-2b7d-4684-907b-e321d71b3f8a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:12:42.694114 master-0 kubenswrapper[7648]: I0308 03:12:42.693854 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-var-lock" (OuterVolumeSpecName: "var-lock") pod "4acb9694-2b7d-4684-907b-e321d71b3f8a" (UID: "4acb9694-2b7d-4684-907b-e321d71b3f8a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:12:42.702887 master-0 kubenswrapper[7648]: I0308 03:12:42.702816 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4acb9694-2b7d-4684-907b-e321d71b3f8a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4acb9694-2b7d-4684-907b-e321d71b3f8a" (UID: "4acb9694-2b7d-4684-907b-e321d71b3f8a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:12:42.794337 master-0 kubenswrapper[7648]: I0308 03:12:42.794170 7648 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:42.794337 master-0 kubenswrapper[7648]: I0308 03:12:42.794224 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4acb9694-2b7d-4684-907b-e321d71b3f8a-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:42.794337 master-0 kubenswrapper[7648]: I0308 03:12:42.794239 7648 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4acb9694-2b7d-4684-907b-e321d71b3f8a-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:43.056016 master-0 kubenswrapper[7648]: I0308 03:12:43.055889 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r494d"]
Mar 08 03:12:43.056016 master-0 kubenswrapper[7648]: I0308 03:12:43.055967 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"]
Mar 08 03:12:43.056248 master-0 kubenswrapper[7648]: E0308 03:12:43.056206 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acb9694-2b7d-4684-907b-e321d71b3f8a" containerName="installer"
Mar 08 03:12:43.056248 master-0 kubenswrapper[7648]: I0308 03:12:43.056218 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acb9694-2b7d-4684-907b-e321d71b3f8a" containerName="installer"
Mar 08 03:12:43.056446 master-0 kubenswrapper[7648]: I0308 03:12:43.056317 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acb9694-2b7d-4684-907b-e321d71b3f8a" containerName="installer"
Mar 08 03:12:43.056729 master-0 kubenswrapper[7648]: I0308 03:12:43.056703 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.057215 master-0 kubenswrapper[7648]: I0308 03:12:43.057166 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 03:12:43.057947 master-0 kubenswrapper[7648]: I0308 03:12:43.057908 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.058740 master-0 kubenswrapper[7648]: I0308 03:12:43.058692 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-68q4j"
Mar 08 03:12:43.058841 master-0 kubenswrapper[7648]: I0308 03:12:43.058785 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt"
Mar 08 03:12:43.063526 master-0 kubenswrapper[7648]: I0308 03:12:43.063458 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58bm\" (UniqueName: \"kubernetes.io/projected/68309159-130a-4ffa-acec-95dc4b795b8f-kube-api-access-k58bm\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:43.122845 master-0 kubenswrapper[7648]: I0308 03:12:43.122793 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:12:43.204507 master-0 kubenswrapper[7648]: I0308 03:12:43.198785 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"]
Mar 08 03:12:43.204507 master-0 kubenswrapper[7648]: I0308 03:12:43.200281 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.204507 master-0 kubenswrapper[7648]: I0308 03:12:43.200349 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.204507 master-0 kubenswrapper[7648]: I0308 03:12:43.200375 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.204507 master-0 kubenswrapper[7648]: I0308 03:12:43.200422 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.204507 master-0 kubenswrapper[7648]: I0308 03:12:43.200456 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.204507 master-0 kubenswrapper[7648]: I0308 03:12:43.200496 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.219978 master-0 kubenswrapper[7648]: I0308 03:12:43.219877 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 08 03:12:43.301501 master-0 kubenswrapper[7648]: I0308 03:12:43.301302 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.301501 master-0 kubenswrapper[7648]: I0308 03:12:43.301358 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.301501 master-0 kubenswrapper[7648]: I0308 03:12:43.301377 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.301501 master-0 kubenswrapper[7648]: I0308 03:12:43.301396 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.301501 master-0 kubenswrapper[7648]: I0308 03:12:43.301463 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.301821 master-0 kubenswrapper[7648]: I0308 03:12:43.301529 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.301821 master-0 kubenswrapper[7648]: I0308 03:12:43.301551 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.301821 master-0 kubenswrapper[7648]: I0308 03:12:43.301614 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.301821 master-0 kubenswrapper[7648]: I0308 03:12:43.301636 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.301821 master-0 kubenswrapper[7648]: I0308 03:12:43.301773 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.433642 master-0 kubenswrapper[7648]: I0308 03:12:43.433602 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_4acb9694-2b7d-4684-907b-e321d71b3f8a/installer/0.log"
Mar 08 03:12:43.434161 master-0 kubenswrapper[7648]: I0308 03:12:43.433661 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"4acb9694-2b7d-4684-907b-e321d71b3f8a","Type":"ContainerDied","Data":"2e9acc31b88d2a78a7d317aebadb3f5a9484ef74f0da66817c67637f29032c81"}
Mar 08 03:12:43.434161 master-0 kubenswrapper[7648]: I0308 03:12:43.433703 7648 scope.go:117] "RemoveContainer" containerID="ecdc56faed81edd2f802f65988bb413c62d0d2476736fb84c13832adfb3985e8"
Mar 08 03:12:43.434161 master-0 kubenswrapper[7648]: I0308 03:12:43.433748 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 08 03:12:43.651769 master-0 kubenswrapper[7648]: I0308 03:12:43.651723 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jnlct"]
Mar 08 03:12:43.652672 master-0 kubenswrapper[7648]: I0308 03:12:43.652653 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.654445 master-0 kubenswrapper[7648]: I0308 03:12:43.654426 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-5lx9s"
Mar 08 03:12:43.714563 master-0 kubenswrapper[7648]: I0308 03:12:43.711109 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnlct"]
Mar 08 03:12:43.723064 master-0 kubenswrapper[7648]: I0308 03:12:43.722525 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:43.732564 master-0 kubenswrapper[7648]: I0308 03:12:43.730340 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:43.759443 master-0 kubenswrapper[7648]: I0308 03:12:43.746622 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nggbb"]
Mar 08 03:12:43.771562 master-0 kubenswrapper[7648]: I0308 03:12:43.770249 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zm92r"]
Mar 08 03:12:43.814059 master-0 kubenswrapper[7648]: I0308 03:12:43.813974 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9jsw\" (UniqueName: \"kubernetes.io/projected/50ab8f71-42b8-4967-8a0b-016647c59a37-kube-api-access-h9jsw\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.814267 master-0 kubenswrapper[7648]: I0308 03:12:43.814116 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-catalog-content\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.814375 master-0 kubenswrapper[7648]: I0308 03:12:43.814353 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-utilities\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.823278 master-0 kubenswrapper[7648]: I0308 03:12:43.822278 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:12:43.833760 master-0 kubenswrapper[7648]: I0308 03:12:43.833700 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 08 03:12:43.847074 master-0 kubenswrapper[7648]: W0308 03:12:43.846956 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68309159_130a_4ffa_acec_95dc4b795b8f.slice/crio-00fbf6ab4c06c34054572affb4623a1ba6b78e4e0116048ccabe5fb462c0f796 WatchSource:0}: Error finding container 00fbf6ab4c06c34054572affb4623a1ba6b78e4e0116048ccabe5fb462c0f796: Status 404 returned error can't find the container with id 00fbf6ab4c06c34054572affb4623a1ba6b78e4e0116048ccabe5fb462c0f796
Mar 08 03:12:43.870624 master-0 kubenswrapper[7648]: I0308 03:12:43.870575 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"]
Mar 08 03:12:43.871795 master-0 kubenswrapper[7648]: I0308 03:12:43.871746 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:43.875256 master-0 kubenswrapper[7648]: I0308 03:12:43.875237 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-dmv4m"
Mar 08 03:12:43.878068 master-0 kubenswrapper[7648]: I0308 03:12:43.878012 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"]
Mar 08 03:12:43.882950 master-0 kubenswrapper[7648]: I0308 03:12:43.882517 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 03:12:43.920226 master-0 kubenswrapper[7648]: I0308 03:12:43.920168 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9jsw\" (UniqueName: \"kubernetes.io/projected/50ab8f71-42b8-4967-8a0b-016647c59a37-kube-api-access-h9jsw\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.920296 master-0 kubenswrapper[7648]: I0308 03:12:43.920263 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-catalog-content\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.920349 master-0 kubenswrapper[7648]: I0308 03:12:43.920326 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-utilities\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.920906 master-0 kubenswrapper[7648]: I0308 03:12:43.920858 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-utilities\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.921177 master-0 kubenswrapper[7648]: I0308 03:12:43.921136 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-catalog-content\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.922999 master-0 kubenswrapper[7648]: I0308 03:12:43.922967 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qh2"]
Mar 08 03:12:43.924414 master-0 kubenswrapper[7648]: I0308 03:12:43.924396 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:43.932171 master-0 kubenswrapper[7648]: I0308 03:12:43.931754 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qktgm"
Mar 08 03:12:43.939398 master-0 kubenswrapper[7648]: I0308 03:12:43.939345 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9jsw\" (UniqueName: \"kubernetes.io/projected/50ab8f71-42b8-4967-8a0b-016647c59a37-kube-api-access-h9jsw\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.957672 master-0 kubenswrapper[7648]: I0308 03:12:43.957627 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qh2"]
Mar 08 03:12:43.966459 master-0 kubenswrapper[7648]: I0308 03:12:43.966359 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:12:43.993427 master-0 kubenswrapper[7648]: I0308 03:12:43.993382 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 08 03:12:44.005813 master-0 kubenswrapper[7648]: I0308 03:12:44.005770 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:12:44.021906 master-0 kubenswrapper[7648]: I0308 03:12:44.021868 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-catalog-content\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.021987 master-0 kubenswrapper[7648]: I0308 03:12:44.021913 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84q5n\" (UniqueName: \"kubernetes.io/projected/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-kube-api-access-84q5n\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:44.021987 master-0 kubenswrapper[7648]: I0308 03:12:44.021977 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7t9\" (UniqueName: \"kubernetes.io/projected/9cf6ce1a-c203-4033-86be-be16694a9062-kube-api-access-ml7t9\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.022059 master-0 kubenswrapper[7648]: I0308 03:12:44.022037 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:44.022202 master-0 kubenswrapper[7648]: I0308 03:12:44.022155 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-utilities\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.123360 master-0 kubenswrapper[7648]: I0308 03:12:44.123315 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7t9\" (UniqueName: \"kubernetes.io/projected/9cf6ce1a-c203-4033-86be-be16694a9062-kube-api-access-ml7t9\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.135055 master-0 kubenswrapper[7648]: I0308 03:12:44.123635 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:44.135055 master-0 kubenswrapper[7648]: I0308 03:12:44.123949 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-utilities\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.135055 master-0 kubenswrapper[7648]: I0308 03:12:44.124057 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-catalog-content\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.135055 master-0 kubenswrapper[7648]: I0308 03:12:44.124084 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84q5n\" (UniqueName: \"kubernetes.io/projected/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-kube-api-access-84q5n\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:44.135055 master-0 kubenswrapper[7648]: I0308 03:12:44.124421 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-utilities\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.135055 master-0 kubenswrapper[7648]: I0308 03:12:44.124458 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-catalog-content\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.140891 master-0 kubenswrapper[7648]: I0308 03:12:44.138058 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:44.152037 master-0 kubenswrapper[7648]: I0308 03:12:44.144147 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7t9\" (UniqueName: \"kubernetes.io/projected/9cf6ce1a-c203-4033-86be-be16694a9062-kube-api-access-ml7t9\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:12:44.152037 master-0 kubenswrapper[7648]: I0308 03:12:44.144281 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84q5n\" (UniqueName: \"kubernetes.io/projected/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-kube-api-access-84q5n\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:44.199839 master-0 kubenswrapper[7648]: I0308 03:12:44.197801 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:12:44.266415 master-0 kubenswrapper[7648]: I0308 03:12:44.266373 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_b9b2988e-7fa3-44ee-be58-51964231a2ab/installer/0.log"
Mar 08 03:12:44.266616 master-0 kubenswrapper[7648]: I0308 03:12:44.266438 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 08 03:12:44.309112 master-0 kubenswrapper[7648]: I0308 03:12:44.308830 7648 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:12:44.326983 master-0 kubenswrapper[7648]: I0308 03:12:44.326931 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-kubelet-dir\") pod \"b9b2988e-7fa3-44ee-be58-51964231a2ab\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " Mar 08 03:12:44.326983 master-0 kubenswrapper[7648]: I0308 03:12:44.326977 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-var-lock\") pod \"b9b2988e-7fa3-44ee-be58-51964231a2ab\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " Mar 08 03:12:44.327080 master-0 kubenswrapper[7648]: I0308 03:12:44.327040 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b2988e-7fa3-44ee-be58-51964231a2ab-kube-api-access\") pod \"b9b2988e-7fa3-44ee-be58-51964231a2ab\" (UID: \"b9b2988e-7fa3-44ee-be58-51964231a2ab\") " Mar 08 03:12:44.327762 master-0 kubenswrapper[7648]: I0308 03:12:44.327573 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b9b2988e-7fa3-44ee-be58-51964231a2ab" (UID: "b9b2988e-7fa3-44ee-be58-51964231a2ab"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:44.327762 master-0 kubenswrapper[7648]: I0308 03:12:44.327608 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-var-lock" (OuterVolumeSpecName: "var-lock") pod "b9b2988e-7fa3-44ee-be58-51964231a2ab" (UID: "b9b2988e-7fa3-44ee-be58-51964231a2ab"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:12:44.335765 master-0 kubenswrapper[7648]: I0308 03:12:44.335696 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b2988e-7fa3-44ee-be58-51964231a2ab-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b9b2988e-7fa3-44ee-be58-51964231a2ab" (UID: "b9b2988e-7fa3-44ee-be58-51964231a2ab"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:12:44.429355 master-0 kubenswrapper[7648]: I0308 03:12:44.428580 7648 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:44.429355 master-0 kubenswrapper[7648]: I0308 03:12:44.428623 7648 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9b2988e-7fa3-44ee-be58-51964231a2ab-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:44.429355 master-0 kubenswrapper[7648]: I0308 03:12:44.428635 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b2988e-7fa3-44ee-be58-51964231a2ab-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:44.444939 master-0 kubenswrapper[7648]: I0308 03:12:44.443352 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnlct"] Mar 08 03:12:44.448854 master-0 kubenswrapper[7648]: W0308 03:12:44.448810 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ab8f71_42b8_4967_8a0b_016647c59a37.slice/crio-26d2cf7f711f68c3d3c9308afe0087b283b13f0a913f5e231daad29627564b0c WatchSource:0}: Error finding container 26d2cf7f711f68c3d3c9308afe0087b283b13f0a913f5e231daad29627564b0c: Status 404 returned error can't 
find the container with id 26d2cf7f711f68c3d3c9308afe0087b283b13f0a913f5e231daad29627564b0c Mar 08 03:12:44.462425 master-0 kubenswrapper[7648]: I0308 03:12:44.462374 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 08 03:12:44.465263 master-0 kubenswrapper[7648]: I0308 03:12:44.465215 7648 generic.go:334] "Generic (PLEG): container finished" podID="68309159-130a-4ffa-acec-95dc4b795b8f" containerID="c9f7a2553aef09408038ebc72fb8e56d18eb9a842f8d18ad116a3d6714abc2f9" exitCode=0 Mar 08 03:12:44.465454 master-0 kubenswrapper[7648]: I0308 03:12:44.465415 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm92r" event={"ID":"68309159-130a-4ffa-acec-95dc4b795b8f","Type":"ContainerDied","Data":"c9f7a2553aef09408038ebc72fb8e56d18eb9a842f8d18ad116a3d6714abc2f9"} Mar 08 03:12:44.465515 master-0 kubenswrapper[7648]: I0308 03:12:44.465459 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm92r" event={"ID":"68309159-130a-4ffa-acec-95dc4b795b8f","Type":"ContainerStarted","Data":"00fbf6ab4c06c34054572affb4623a1ba6b78e4e0116048ccabe5fb462c0f796"} Mar 08 03:12:44.469935 master-0 kubenswrapper[7648]: I0308 03:12:44.469908 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_b9b2988e-7fa3-44ee-be58-51964231a2ab/installer/0.log" Mar 08 03:12:44.469974 master-0 kubenswrapper[7648]: I0308 03:12:44.469953 7648 generic.go:334] "Generic (PLEG): container finished" podID="b9b2988e-7fa3-44ee-be58-51964231a2ab" containerID="6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642" exitCode=1 Mar 08 03:12:44.470040 master-0 kubenswrapper[7648]: I0308 03:12:44.470009 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" 
event={"ID":"b9b2988e-7fa3-44ee-be58-51964231a2ab","Type":"ContainerDied","Data":"6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642"} Mar 08 03:12:44.470078 master-0 kubenswrapper[7648]: I0308 03:12:44.470043 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"b9b2988e-7fa3-44ee-be58-51964231a2ab","Type":"ContainerDied","Data":"dfd20110ce4cf1cfff31e419d57e6348705990b2bdc516a8aae4208278e8e44e"} Mar 08 03:12:44.470078 master-0 kubenswrapper[7648]: I0308 03:12:44.470063 7648 scope.go:117] "RemoveContainer" containerID="6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642" Mar 08 03:12:44.470166 master-0 kubenswrapper[7648]: I0308 03:12:44.470142 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 08 03:12:44.515706 master-0 kubenswrapper[7648]: I0308 03:12:44.514327 7648 scope.go:117] "RemoveContainer" containerID="6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642" Mar 08 03:12:44.517780 master-0 kubenswrapper[7648]: E0308 03:12:44.516109 7648 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642\": container with ID starting with 6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642 not found: ID does not exist" containerID="6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642" Mar 08 03:12:44.517780 master-0 kubenswrapper[7648]: I0308 03:12:44.516156 7648 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642"} err="failed to get container status \"6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642\": rpc error: code = NotFound desc = could not find container 
\"6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642\": container with ID starting with 6d6224364f0f8bfc5a5484364018b7804269b531cfe2095f91e0c09b0658c642 not found: ID does not exist" Mar 08 03:12:44.546437 master-0 kubenswrapper[7648]: I0308 03:12:44.546391 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:12:44.548268 master-0 kubenswrapper[7648]: I0308 03:12:44.548234 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 08 03:12:44.599982 master-0 kubenswrapper[7648]: I0308 03:12:44.599878 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 08 03:12:44.611905 master-0 kubenswrapper[7648]: W0308 03:12:44.611868 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podacb74744_fb99_4663_a7d0_7bae2db205e9.slice/crio-a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176 WatchSource:0}: Error finding container a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176: Status 404 returned error can't find the container with id a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176 Mar 08 03:12:44.718752 master-0 kubenswrapper[7648]: I0308 03:12:44.718567 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"] Mar 08 03:12:44.761314 master-0 kubenswrapper[7648]: I0308 03:12:44.761065 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qh2"] Mar 08 03:12:44.779465 master-0 kubenswrapper[7648]: W0308 03:12:44.779407 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2c9576_f7bd_4ac5_a7fe_530f26642f97.slice/crio-aa875da543c11c30db67f621b636a4334559efa7a8f51023550cfe8454360f9c WatchSource:0}: Error 
finding container aa875da543c11c30db67f621b636a4334559efa7a8f51023550cfe8454360f9c: Status 404 returned error can't find the container with id aa875da543c11c30db67f621b636a4334559efa7a8f51023550cfe8454360f9c Mar 08 03:12:44.780105 master-0 kubenswrapper[7648]: W0308 03:12:44.780060 7648 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cf6ce1a_c203_4033_86be_be16694a9062.slice/crio-e9eec45b78006702c1cd45f46eb5970fb8a098410841c5321ec7f96f7dedcf63 WatchSource:0}: Error finding container e9eec45b78006702c1cd45f46eb5970fb8a098410841c5321ec7f96f7dedcf63: Status 404 returned error can't find the container with id e9eec45b78006702c1cd45f46eb5970fb8a098410841c5321ec7f96f7dedcf63 Mar 08 03:12:45.482293 master-0 kubenswrapper[7648]: I0308 03:12:45.481980 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"2dc664e3-7f37-4fba-8104-544ffb18c1bd","Type":"ContainerStarted","Data":"68c94d100f2836b6f0dea34646419e405565e371a80c5355bfca798f46638f44"} Mar 08 03:12:45.482293 master-0 kubenswrapper[7648]: I0308 03:12:45.482039 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"2dc664e3-7f37-4fba-8104-544ffb18c1bd","Type":"ContainerStarted","Data":"047da584d2529f2fb501b0d6492e21ff57d43239aa0e696e0881839756608bc9"} Mar 08 03:12:45.485341 master-0 kubenswrapper[7648]: I0308 03:12:45.484928 7648 generic.go:334] "Generic (PLEG): container finished" podID="9cf6ce1a-c203-4033-86be-be16694a9062" containerID="ccf5656bb56a19c7a22e492a44ae1446dc7c5b94a77f84f22b258b7af6805d2a" exitCode=0 Mar 08 03:12:45.485651 master-0 kubenswrapper[7648]: I0308 03:12:45.485392 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qh2" 
event={"ID":"9cf6ce1a-c203-4033-86be-be16694a9062","Type":"ContainerDied","Data":"ccf5656bb56a19c7a22e492a44ae1446dc7c5b94a77f84f22b258b7af6805d2a"} Mar 08 03:12:45.485651 master-0 kubenswrapper[7648]: I0308 03:12:45.485432 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qh2" event={"ID":"9cf6ce1a-c203-4033-86be-be16694a9062","Type":"ContainerStarted","Data":"e9eec45b78006702c1cd45f46eb5970fb8a098410841c5321ec7f96f7dedcf63"} Mar 08 03:12:45.492456 master-0 kubenswrapper[7648]: I0308 03:12:45.492426 7648 generic.go:334] "Generic (PLEG): container finished" podID="50ab8f71-42b8-4967-8a0b-016647c59a37" containerID="e1d9e093cba9edf2b9fe5ff93e3ebb84d76e14c7ae92e011cb61c2ecdf53de26" exitCode=0 Mar 08 03:12:45.493137 master-0 kubenswrapper[7648]: I0308 03:12:45.492493 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnlct" event={"ID":"50ab8f71-42b8-4967-8a0b-016647c59a37","Type":"ContainerDied","Data":"e1d9e093cba9edf2b9fe5ff93e3ebb84d76e14c7ae92e011cb61c2ecdf53de26"} Mar 08 03:12:45.493954 master-0 kubenswrapper[7648]: I0308 03:12:45.493407 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnlct" event={"ID":"50ab8f71-42b8-4967-8a0b-016647c59a37","Type":"ContainerStarted","Data":"26d2cf7f711f68c3d3c9308afe0087b283b13f0a913f5e231daad29627564b0c"} Mar 08 03:12:45.499606 master-0 kubenswrapper[7648]: I0308 03:12:45.499545 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=3.499532862 podStartE2EDuration="3.499532862s" podCreationTimestamp="2026-03-08 03:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:45.496649763 +0000 UTC m=+58.107968053" watchObservedRunningTime="2026-03-08 03:12:45.499532862 +0000 UTC m=+58.110851152" Mar 
08 03:12:45.500723 master-0 kubenswrapper[7648]: I0308 03:12:45.500350 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" event={"ID":"5a2c9576-f7bd-4ac5-a7fe-530f26642f97","Type":"ContainerStarted","Data":"aa875da543c11c30db67f621b636a4334559efa7a8f51023550cfe8454360f9c"} Mar 08 03:12:45.508536 master-0 kubenswrapper[7648]: I0308 03:12:45.508452 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:12:45.508847 master-0 kubenswrapper[7648]: I0308 03:12:45.508796 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="e81d3c37-e8d7-44c8-973e-13992380ce85" containerName="installer" containerID="cri-o://d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e" gracePeriod=30 Mar 08 03:12:45.515470 master-0 kubenswrapper[7648]: I0308 03:12:45.515394 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"acb74744-fb99-4663-a7d0-7bae2db205e9","Type":"ContainerStarted","Data":"cc85031403c41701fdfa514e870d79fe56e4ed3f33238513795cdc2323e4fac2"} Mar 08 03:12:45.515470 master-0 kubenswrapper[7648]: I0308 03:12:45.515466 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"acb74744-fb99-4663-a7d0-7bae2db205e9","Type":"ContainerStarted","Data":"a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176"} Mar 08 03:12:45.582403 master-0 kubenswrapper[7648]: I0308 03:12:45.582339 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=3.582323529 podStartE2EDuration="3.582323529s" podCreationTimestamp="2026-03-08 03:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:45.581304284 +0000 UTC m=+58.192622574" watchObservedRunningTime="2026-03-08 03:12:45.582323529 +0000 UTC m=+58.193641819" Mar 08 03:12:45.620362 master-0 kubenswrapper[7648]: I0308 03:12:45.620257 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4acb9694-2b7d-4684-907b-e321d71b3f8a" path="/var/lib/kubelet/pods/4acb9694-2b7d-4684-907b-e321d71b3f8a/volumes" Mar 08 03:12:45.622222 master-0 kubenswrapper[7648]: I0308 03:12:45.620760 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b2988e-7fa3-44ee-be58-51964231a2ab" path="/var/lib/kubelet/pods/b9b2988e-7fa3-44ee-be58-51964231a2ab/volumes" Mar 08 03:12:47.860975 master-0 kubenswrapper[7648]: I0308 03:12:47.860846 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4"] Mar 08 03:12:47.874422 master-0 kubenswrapper[7648]: E0308 03:12:47.872651 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b2988e-7fa3-44ee-be58-51964231a2ab" containerName="installer" Mar 08 03:12:47.874422 master-0 kubenswrapper[7648]: I0308 03:12:47.872698 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b2988e-7fa3-44ee-be58-51964231a2ab" containerName="installer" Mar 08 03:12:47.874422 master-0 kubenswrapper[7648]: I0308 03:12:47.872907 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b2988e-7fa3-44ee-be58-51964231a2ab" containerName="installer" Mar 08 03:12:47.874422 master-0 kubenswrapper[7648]: I0308 03:12:47.873892 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:47.902832 master-0 kubenswrapper[7648]: I0308 03:12:47.902719 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 03:12:47.902832 master-0 kubenswrapper[7648]: I0308 03:12:47.902757 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-zr868" Mar 08 03:12:47.904652 master-0 kubenswrapper[7648]: I0308 03:12:47.903361 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 03:12:47.904652 master-0 kubenswrapper[7648]: I0308 03:12:47.903385 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 03:12:47.904652 master-0 kubenswrapper[7648]: I0308 03:12:47.903614 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 03:12:47.904652 master-0 kubenswrapper[7648]: I0308 03:12:47.903802 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 03:12:47.963188 master-0 kubenswrapper[7648]: I0308 03:12:47.963085 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:47.963188 master-0 kubenswrapper[7648]: I0308 03:12:47.963131 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" 
(UniqueName: \"kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:47.963188 master-0 kubenswrapper[7648]: I0308 03:12:47.963165 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bcd7\" (UniqueName: \"kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:47.963188 master-0 kubenswrapper[7648]: I0308 03:12:47.963184 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.065212 master-0 kubenswrapper[7648]: I0308 03:12:48.065092 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.065212 master-0 kubenswrapper[7648]: I0308 03:12:48.065154 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " 
pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.065212 master-0 kubenswrapper[7648]: I0308 03:12:48.065203 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bcd7\" (UniqueName: \"kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.065707 master-0 kubenswrapper[7648]: I0308 03:12:48.065225 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.066447 master-0 kubenswrapper[7648]: I0308 03:12:48.066410 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.069303 master-0 kubenswrapper[7648]: I0308 03:12:48.069279 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.069510 master-0 kubenswrapper[7648]: I0308 03:12:48.069312 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.083928 master-0 kubenswrapper[7648]: I0308 03:12:48.083897 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bcd7\" (UniqueName: \"kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.581452 master-0 kubenswrapper[7648]: I0308 03:12:48.581384 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:12:48.908205 master-0 kubenswrapper[7648]: I0308 03:12:48.907404 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 08 03:12:48.915126 master-0 kubenswrapper[7648]: I0308 03:12:48.914119 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:48.948840 master-0 kubenswrapper[7648]: I0308 03:12:48.948787 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 03:12:48.983436 master-0 kubenswrapper[7648]: I0308 03:12:48.981744 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" event={"ID":"234638fe-5577-45bc-9094-907c5611da38","Type":"ContainerStarted","Data":"a153dffc082c8d8e34a6c6e6c0c21f4bb223cf1b6ae19843ae82a4a21f8d697f"}
Mar 08 03:12:49.048738 master-0 kubenswrapper[7648]: I0308 03:12:49.048650 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.048942 master-0 kubenswrapper[7648]: I0308 03:12:49.048876 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.048942 master-0 kubenswrapper[7648]: I0308 03:12:49.048912 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.071416 master-0 kubenswrapper[7648]: I0308 03:12:49.070674 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"]
Mar 08 03:12:49.073903 master-0 kubenswrapper[7648]: I0308 03:12:49.073764 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.081543 master-0 kubenswrapper[7648]: I0308 03:12:49.080066 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 08 03:12:49.081543 master-0 kubenswrapper[7648]: I0308 03:12:49.080289 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 08 03:12:49.081758 master-0 kubenswrapper[7648]: I0308 03:12:49.081605 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rdhz7"
Mar 08 03:12:49.085206 master-0 kubenswrapper[7648]: I0308 03:12:49.084143 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 08 03:12:49.086700 master-0 kubenswrapper[7648]: I0308 03:12:49.086677 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 08 03:12:49.093708 master-0 kubenswrapper[7648]: I0308 03:12:49.092360 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"]
Mar 08 03:12:49.149573 master-0 kubenswrapper[7648]: I0308 03:12:49.149463 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.149573 master-0 kubenswrapper[7648]: I0308 03:12:49.149536 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.149771 master-0 kubenswrapper[7648]: I0308 03:12:49.149662 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bpwx\" (UniqueName: \"kubernetes.io/projected/caa3a50c-1291-4152-a48a-f7c7b49627db-kube-api-access-6bpwx\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.149771 master-0 kubenswrapper[7648]: I0308 03:12:49.149702 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.149771 master-0 kubenswrapper[7648]: I0308 03:12:49.149727 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.149860 master-0 kubenswrapper[7648]: I0308 03:12:49.149777 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.149937 master-0 kubenswrapper[7648]: I0308 03:12:49.149872 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.149974 master-0 kubenswrapper[7648]: I0308 03:12:49.149956 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.174725 master-0 kubenswrapper[7648]: I0308 03:12:49.174674 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.251755 master-0 kubenswrapper[7648]: I0308 03:12:49.251721 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.251841 master-0 kubenswrapper[7648]: I0308 03:12:49.251786 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.251841 master-0 kubenswrapper[7648]: I0308 03:12:49.251827 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpwx\" (UniqueName: \"kubernetes.io/projected/caa3a50c-1291-4152-a48a-f7c7b49627db-kube-api-access-6bpwx\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.254608 master-0 kubenswrapper[7648]: I0308 03:12:49.254558 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.262161 master-0 kubenswrapper[7648]: I0308 03:12:49.262103 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.268971 master-0 kubenswrapper[7648]: I0308 03:12:49.268909 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpwx\" (UniqueName: \"kubernetes.io/projected/caa3a50c-1291-4152-a48a-f7c7b49627db-kube-api-access-6bpwx\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:49.296221 master-0 kubenswrapper[7648]: I0308 03:12:49.295843 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:12:49.414933 master-0 kubenswrapper[7648]: I0308 03:12:49.414807 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:12:50.020737 master-0 kubenswrapper[7648]: I0308 03:12:50.000128 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" event={"ID":"234638fe-5577-45bc-9094-907c5611da38","Type":"ContainerStarted","Data":"384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532"}
Mar 08 03:12:50.020737 master-0 kubenswrapper[7648]: I0308 03:12:50.005976 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" event={"ID":"5a2c9576-f7bd-4ac5-a7fe-530f26642f97","Type":"ContainerStarted","Data":"27ab0f00e980c7d4d9fcf7e8c62f276ea49b975eb80fef82536adf6bfc74a796"}
Mar 08 03:12:50.029623 master-0 kubenswrapper[7648]: I0308 03:12:50.029535 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" podStartSLOduration=2.172161045 podStartE2EDuration="7.029513147s" podCreationTimestamp="2026-03-08 03:12:43 +0000 UTC" firstStartedPulling="2026-03-08 03:12:44.784434845 +0000 UTC m=+57.395753135" lastFinishedPulling="2026-03-08 03:12:49.641786927 +0000 UTC m=+62.253105237" observedRunningTime="2026-03-08 03:12:50.028801362 +0000 UTC m=+62.640119702" watchObservedRunningTime="2026-03-08 03:12:50.029513147 +0000 UTC m=+62.640831437"
Mar 08 03:12:50.085249 master-0 kubenswrapper[7648]: I0308 03:12:50.083869 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"]
Mar 08 03:12:50.130690 master-0 kubenswrapper[7648]: I0308 03:12:50.130290 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 08 03:12:51.702917 master-0 kubenswrapper[7648]: I0308 03:12:51.696204 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0781e6af-f5b5-40f7-bb7f-5bc6978b4957","Type":"ContainerStarted","Data":"5f838d68f76fe62ef4db0397f12606cc88f67f5fe18c59d85b5a1981e0647d72"}
Mar 08 03:12:51.702917 master-0 kubenswrapper[7648]: I0308 03:12:51.699835 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" event={"ID":"caa3a50c-1291-4152-a48a-f7c7b49627db","Type":"ContainerStarted","Data":"5ae03b9864de5a223907743c7768d67db56fcd741d6926b04cca5e65b9fe842b"}
Mar 08 03:12:51.702917 master-0 kubenswrapper[7648]: I0308 03:12:51.699876 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" event={"ID":"caa3a50c-1291-4152-a48a-f7c7b49627db","Type":"ContainerStarted","Data":"5084fee9f78fd3409fb341ad2acde32d4d944cb2a228b141d353e9f022872f48"}
Mar 08 03:12:51.873362 master-0 kubenswrapper[7648]: I0308 03:12:51.872906 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"]
Mar 08 03:12:51.875068 master-0 kubenswrapper[7648]: I0308 03:12:51.874234 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:51.889212 master-0 kubenswrapper[7648]: I0308 03:12:51.888103 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 08 03:12:51.889212 master-0 kubenswrapper[7648]: I0308 03:12:51.888354 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-f5lxw"
Mar 08 03:12:51.889212 master-0 kubenswrapper[7648]: I0308 03:12:51.888735 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08 03:12:51.893076 master-0 kubenswrapper[7648]: I0308 03:12:51.892653 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 08 03:12:51.914964 master-0 kubenswrapper[7648]: I0308 03:12:51.914166 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"]
Mar 08 03:12:51.955235 master-0 kubenswrapper[7648]: I0308 03:12:51.955134 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"]
Mar 08 03:12:51.955991 master-0 kubenswrapper[7648]: I0308 03:12:51.955965 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:51.974991 master-0 kubenswrapper[7648]: I0308 03:12:51.973703 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 08 03:12:51.974991 master-0 kubenswrapper[7648]: I0308 03:12:51.973911 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tvw7c"
Mar 08 03:12:51.974991 master-0 kubenswrapper[7648]: I0308 03:12:51.974117 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 08 03:12:51.993995 master-0 kubenswrapper[7648]: I0308 03:12:51.988560 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"]
Mar 08 03:12:52.015718 master-0 kubenswrapper[7648]: I0308 03:12:52.004342 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:52.015718 master-0 kubenswrapper[7648]: I0308 03:12:52.015668 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.015935 master-0 kubenswrapper[7648]: I0308 03:12:52.015833 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb2xh\" (UniqueName: \"kubernetes.io/projected/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-kube-api-access-pb2xh\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.015935 master-0 kubenswrapper[7648]: I0308 03:12:52.015884 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.016045 master-0 kubenswrapper[7648]: I0308 03:12:52.015974 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt69c\" (UniqueName: \"kubernetes.io/projected/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-kube-api-access-qt69c\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:52.037704 master-0 kubenswrapper[7648]: I0308 03:12:52.037593 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55644b7446-5ckmr"]
Mar 08 03:12:52.038128 master-0 kubenswrapper[7648]: I0308 03:12:52.037942 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" podUID="64620a50-b5de-4b7f-84a3-a2df9d7da9fe" containerName="controller-manager" containerID="cri-o://44c51ef09a1e4d640e56c548d7e597beadc76bae6917d92d87ab66115296db6e" gracePeriod=30
Mar 08 03:12:52.090513 master-0 kubenswrapper[7648]: I0308 03:12:52.084965 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg"]
Mar 08 03:12:52.090513 master-0 kubenswrapper[7648]: I0308 03:12:52.085270 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" podUID="49384416-21b1-4d87-9ab4-77f0efbb9ff8" containerName="route-controller-manager" containerID="cri-o://29ec3831341bb36e2171a6456449c15abed4681f1b44070d0757301e9477938f" gracePeriod=30
Mar 08 03:12:52.125761 master-0 kubenswrapper[7648]: I0308 03:12:52.125722 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.125836 master-0 kubenswrapper[7648]: I0308 03:12:52.125821 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt69c\" (UniqueName: \"kubernetes.io/projected/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-kube-api-access-qt69c\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:52.125887 master-0 kubenswrapper[7648]: I0308 03:12:52.125876 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:52.125917 master-0 kubenswrapper[7648]: I0308 03:12:52.125894 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.125950 master-0 kubenswrapper[7648]: I0308 03:12:52.125926 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb2xh\" (UniqueName: \"kubernetes.io/projected/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-kube-api-access-pb2xh\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.129410 master-0 kubenswrapper[7648]: I0308 03:12:52.129374 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:52.130079 master-0 kubenswrapper[7648]: I0308 03:12:52.130049 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.130139 master-0 kubenswrapper[7648]: E0308 03:12:52.130112 7648 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: secret "cluster-autoscaler-operator-cert" not found
Mar 08 03:12:52.130172 master-0 kubenswrapper[7648]: E0308 03:12:52.130148 7648 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert podName:17eaab63-9ba9-4a4a-891d-a76aa3f03b46 nodeName:}" failed. No retries permitted until 2026-03-08 03:12:52.630135136 +0000 UTC m=+65.241453426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert") pod "cluster-autoscaler-operator-69576476f7-cpnw6" (UID: "17eaab63-9ba9-4a4a-891d-a76aa3f03b46") : secret "cluster-autoscaler-operator-cert" not found
Mar 08 03:12:52.181914 master-0 kubenswrapper[7648]: I0308 03:12:52.181736 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt69c\" (UniqueName: \"kubernetes.io/projected/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-kube-api-access-qt69c\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:52.188190 master-0 kubenswrapper[7648]: I0308 03:12:52.186775 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb2xh\" (UniqueName: \"kubernetes.io/projected/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-kube-api-access-pb2xh\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:52.198711 master-0 kubenswrapper[7648]: I0308 03:12:52.198642 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"]
Mar 08 03:12:52.199745 master-0 kubenswrapper[7648]: I0308 03:12:52.199701 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:52.204049 master-0 kubenswrapper[7648]: I0308 03:12:52.202833 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-twhrj"
Mar 08 03:12:52.204049 master-0 kubenswrapper[7648]: I0308 03:12:52.203044 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 08 03:12:52.220417 master-0 kubenswrapper[7648]: I0308 03:12:52.219688 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"]
Mar 08 03:12:52.222856 master-0 kubenswrapper[7648]: I0308 03:12:52.222801 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:12:52.237568 master-0 kubenswrapper[7648]: I0308 03:12:52.237534 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e569889-4759-4046-b0ed-e550078521c6-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:52.237668 master-0 kubenswrapper[7648]: I0308 03:12:52.237640 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97fm\" (UniqueName: \"kubernetes.io/projected/0e569889-4759-4046-b0ed-e550078521c6-kube-api-access-m97fm\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:52.607160 master-0 kubenswrapper[7648]: I0308 03:12:52.607104 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e569889-4759-4046-b0ed-e550078521c6-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:52.607350 master-0 kubenswrapper[7648]: I0308 03:12:52.607316 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97fm\" (UniqueName: \"kubernetes.io/projected/0e569889-4759-4046-b0ed-e550078521c6-kube-api-access-m97fm\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:53.232914 master-0 kubenswrapper[7648]: I0308 03:12:53.232800 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e569889-4759-4046-b0ed-e550078521c6-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:53.242786 master-0 kubenswrapper[7648]: I0308 03:12:53.242704 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:53.250797 master-0 kubenswrapper[7648]: I0308 03:12:53.250760 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:53.251460 master-0 kubenswrapper[7648]: I0308 03:12:53.251429 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" event={"ID":"64620a50-b5de-4b7f-84a3-a2df9d7da9fe","Type":"ContainerDied","Data":"44c51ef09a1e4d640e56c548d7e597beadc76bae6917d92d87ab66115296db6e"}
Mar 08 03:12:53.252372 master-0 kubenswrapper[7648]: I0308 03:12:53.251061 7648 generic.go:334] "Generic (PLEG): container finished" podID="64620a50-b5de-4b7f-84a3-a2df9d7da9fe" containerID="44c51ef09a1e4d640e56c548d7e597beadc76bae6917d92d87ab66115296db6e" exitCode=0
Mar 08 03:12:53.275806 master-0 kubenswrapper[7648]: I0308 03:12:53.275752 7648 generic.go:334] "Generic (PLEG): container finished" podID="49384416-21b1-4d87-9ab4-77f0efbb9ff8" containerID="29ec3831341bb36e2171a6456449c15abed4681f1b44070d0757301e9477938f" exitCode=0
Mar 08 03:12:53.275981 master-0 kubenswrapper[7648]: I0308 03:12:53.275838 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" event={"ID":"49384416-21b1-4d87-9ab4-77f0efbb9ff8","Type":"ContainerDied","Data":"29ec3831341bb36e2171a6456449c15abed4681f1b44070d0757301e9477938f"}
Mar 08 03:12:53.295345 master-0 kubenswrapper[7648]: I0308 03:12:53.295279 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0781e6af-f5b5-40f7-bb7f-5bc6978b4957","Type":"ContainerStarted","Data":"d6a6af9b5c35efad9748f9601d83a886b44fae8599777699f25ccf5aa2fcd4b8"}
Mar 08 03:12:53.353858 master-0 kubenswrapper[7648]: I0308 03:12:53.353791 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=5.35374602 podStartE2EDuration="5.35374602s" podCreationTimestamp="2026-03-08 03:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:53.347175413 +0000 UTC m=+65.958493703" watchObservedRunningTime="2026-03-08 03:12:53.35374602 +0000 UTC m=+65.965064310"
Mar 08 03:12:53.407546 master-0 kubenswrapper[7648]: I0308 03:12:53.399971 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97fm\" (UniqueName: \"kubernetes.io/projected/0e569889-4759-4046-b0ed-e550078521c6-kube-api-access-m97fm\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:53.420629 master-0 kubenswrapper[7648]: I0308 03:12:53.420587 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"]
Mar 08 03:12:53.422746 master-0 kubenswrapper[7648]: I0308 03:12:53.422725 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.429996 master-0 kubenswrapper[7648]: I0308 03:12:53.429953 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"]
Mar 08 03:12:53.430850 master-0 kubenswrapper[7648]: I0308 03:12:53.430737 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 08 03:12:53.431217 master-0 kubenswrapper[7648]: I0308 03:12:53.431191 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 08 03:12:53.431605 master-0 kubenswrapper[7648]: I0308 03:12:53.431562 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 08 03:12:53.431605 master-0 kubenswrapper[7648]: I0308 03:12:53.431575 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 08 03:12:53.431813 master-0 kubenswrapper[7648]: I0308 03:12:53.431796 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-svw57"
Mar 08 03:12:53.461623 master-0 kubenswrapper[7648]: I0308 03:12:53.461407 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtt8w\" (UniqueName: \"kubernetes.io/projected/fe33f926-9348-4498-a892-d2becaeecc14-kube-api-access-rtt8w\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.461623 master-0 kubenswrapper[7648]: I0308 03:12:53.461452 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.461623 master-0 kubenswrapper[7648]: I0308 03:12:53.461555 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.461623 master-0 kubenswrapper[7648]: I0308 03:12:53.461576 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.472110 master-0 kubenswrapper[7648]: I0308 03:12:53.470276 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-zd6kq"]
Mar 08 03:12:53.472110 master-0 kubenswrapper[7648]: I0308 03:12:53.472031 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq"
Mar 08 03:12:53.480025 master-0 kubenswrapper[7648]: I0308 03:12:53.479988 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 08 03:12:53.484820 master-0 kubenswrapper[7648]: I0308 03:12:53.484502 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 08 03:12:53.491791 master-0 kubenswrapper[7648]: I0308 03:12:53.484734 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-sbm8j"
Mar 08 03:12:53.499538 master-0 kubenswrapper[7648]: I0308 03:12:53.495437 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:12:53.499798 master-0 kubenswrapper[7648]: I0308 03:12:53.484846 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 08 03:12:53.499798 master-0 kubenswrapper[7648]: I0308 03:12:53.490331 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 03:12:53.499931 master-0 kubenswrapper[7648]: I0308 03:12:53.490942 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 08 03:12:53.500020 master-0 kubenswrapper[7648]: I0308 03:12:53.493805 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 08 03:12:53.500054 master-0 kubenswrapper[7648]: I0308 03:12:53.496268 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-zd6kq"]
Mar 08 03:12:53.521804 master-0 kubenswrapper[7648]: I0308 03:12:53.520730 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:12:53.570896 master-0 kubenswrapper[7648]: I0308 03:12:53.570860 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.571105 master-0 kubenswrapper[7648]: I0308 03:12:53.570910 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.574728 master-0 kubenswrapper[7648]: I0308 03:12:53.571600 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.574728 master-0 kubenswrapper[7648]: I0308 03:12:53.571966 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtt8w\" (UniqueName: \"kubernetes.io/projected/fe33f926-9348-4498-a892-d2becaeecc14-kube-api-access-rtt8w\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:12:53.574728 master-0 kubenswrapper[7648]: I0308 03:12:53.572043 7648 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:12:53.582766 master-0 kubenswrapper[7648]: I0308 03:12:53.575750 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:12:53.586683 master-0 kubenswrapper[7648]: I0308 03:12:53.584364 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:12:53.590511 master-0 kubenswrapper[7648]: I0308 03:12:53.589832 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtt8w\" (UniqueName: \"kubernetes.io/projected/fe33f926-9348-4498-a892-d2becaeecc14-kube-api-access-rtt8w\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:12:53.673650 master-0 kubenswrapper[7648]: I0308 03:12:53.673600 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-service-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: 
\"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.673809 master-0 kubenswrapper[7648]: I0308 03:12:53.673657 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-snapshots\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.673809 master-0 kubenswrapper[7648]: I0308 03:12:53.673677 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.673809 master-0 kubenswrapper[7648]: I0308 03:12:53.673693 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.673809 master-0 kubenswrapper[7648]: I0308 03:12:53.673712 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvzs9\" (UniqueName: \"kubernetes.io/projected/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-kube-api-access-fvzs9\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.733105 master-0 kubenswrapper[7648]: I0308 03:12:53.733051 7648 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"] Mar 08 03:12:53.734654 master-0 kubenswrapper[7648]: I0308 03:12:53.734585 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.738477 master-0 kubenswrapper[7648]: I0308 03:12:53.738408 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:12:53.738575 master-0 kubenswrapper[7648]: I0308 03:12:53.738467 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 03:12:53.741026 master-0 kubenswrapper[7648]: I0308 03:12:53.739818 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-rkgwg" Mar 08 03:12:53.741026 master-0 kubenswrapper[7648]: I0308 03:12:53.740617 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:12:53.741026 master-0 kubenswrapper[7648]: I0308 03:12:53.740644 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:12:53.741026 master-0 kubenswrapper[7648]: I0308 03:12:53.740669 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774534 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-service-ca-bundle\") pod 
\"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774592 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-snapshots\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774618 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774637 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774746 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774791 7648 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvzs9\" (UniqueName: \"kubernetes.io/projected/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-kube-api-access-fvzs9\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774820 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774842 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 03:12:53.774863 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxtt\" (UniqueName: \"kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.774874 master-0 kubenswrapper[7648]: I0308 
03:12:53.774887 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.776739 master-0 kubenswrapper[7648]: I0308 03:12:53.775726 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-service-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.776739 master-0 kubenswrapper[7648]: I0308 03:12:53.776698 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.777603 master-0 kubenswrapper[7648]: I0308 03:12:53.777535 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-snapshots\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.780418 master-0 kubenswrapper[7648]: I0308 03:12:53.780381 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert\") pod \"insights-operator-8f89dfddd-zd6kq\" 
(UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.785554 master-0 kubenswrapper[7648]: I0308 03:12:53.784989 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" Mar 08 03:12:53.800329 master-0 kubenswrapper[7648]: I0308 03:12:53.800294 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvzs9\" (UniqueName: \"kubernetes.io/projected/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-kube-api-access-fvzs9\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.814256 master-0 kubenswrapper[7648]: I0308 03:12:53.814191 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:12:53.825896 master-0 kubenswrapper[7648]: I0308 03:12:53.825847 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.837298 7648 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.841887 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"] Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: E0308 03:12:53.842211 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64620a50-b5de-4b7f-84a3-a2df9d7da9fe" containerName="controller-manager" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.842228 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="64620a50-b5de-4b7f-84a3-a2df9d7da9fe" containerName="controller-manager" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: E0308 03:12:53.842251 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49384416-21b1-4d87-9ab4-77f0efbb9ff8" containerName="route-controller-manager" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.842260 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="49384416-21b1-4d87-9ab4-77f0efbb9ff8" containerName="route-controller-manager" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.842511 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="64620a50-b5de-4b7f-84a3-a2df9d7da9fe" containerName="controller-manager" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.842535 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="49384416-21b1-4d87-9ab4-77f0efbb9ff8" containerName="route-controller-manager" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.843002 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:12:53.848438 master-0 kubenswrapper[7648]: I0308 03:12:53.846749 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"] Mar 08 03:12:53.878075 master-0 kubenswrapper[7648]: I0308 03:12:53.877814 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-config\") pod \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " Mar 08 03:12:53.878075 master-0 kubenswrapper[7648]: I0308 03:12:53.877869 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-client-ca\") pod \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " Mar 08 03:12:53.878075 master-0 kubenswrapper[7648]: I0308 03:12:53.877934 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-client-ca\") pod \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " Mar 08 03:12:53.878075 master-0 kubenswrapper[7648]: I0308 03:12:53.877983 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ml8w\" (UniqueName: \"kubernetes.io/projected/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-kube-api-access-7ml8w\") pod \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " Mar 08 03:12:53.878075 master-0 kubenswrapper[7648]: I0308 03:12:53.878012 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49384416-21b1-4d87-9ab4-77f0efbb9ff8-serving-cert\") pod \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " Mar 08 03:12:53.878075 master-0 kubenswrapper[7648]: I0308 03:12:53.878058 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-config\") pod \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " Mar 08 03:12:53.880298 master-0 kubenswrapper[7648]: I0308 03:12:53.879047 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-client-ca" (OuterVolumeSpecName: "client-ca") pod "64620a50-b5de-4b7f-84a3-a2df9d7da9fe" (UID: "64620a50-b5de-4b7f-84a3-a2df9d7da9fe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:53.880298 master-0 kubenswrapper[7648]: I0308 03:12:53.879440 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-config" (OuterVolumeSpecName: "config") pod "64620a50-b5de-4b7f-84a3-a2df9d7da9fe" (UID: "64620a50-b5de-4b7f-84a3-a2df9d7da9fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:53.880298 master-0 kubenswrapper[7648]: I0308 03:12:53.879922 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-config" (OuterVolumeSpecName: "config") pod "49384416-21b1-4d87-9ab4-77f0efbb9ff8" (UID: "49384416-21b1-4d87-9ab4-77f0efbb9ff8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:53.880298 master-0 kubenswrapper[7648]: I0308 03:12:53.879962 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-serving-cert\") pod \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " Mar 08 03:12:53.880298 master-0 kubenswrapper[7648]: I0308 03:12:53.879995 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-proxy-ca-bundles\") pod \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\" (UID: \"64620a50-b5de-4b7f-84a3-a2df9d7da9fe\") " Mar 08 03:12:53.880298 master-0 kubenswrapper[7648]: I0308 03:12:53.880020 7648 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msq9j\" (UniqueName: \"kubernetes.io/projected/49384416-21b1-4d87-9ab4-77f0efbb9ff8-kube-api-access-msq9j\") pod \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\" (UID: \"49384416-21b1-4d87-9ab4-77f0efbb9ff8\") " Mar 08 03:12:53.881138 master-0 kubenswrapper[7648]: I0308 03:12:53.880390 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "64620a50-b5de-4b7f-84a3-a2df9d7da9fe" (UID: "64620a50-b5de-4b7f-84a3-a2df9d7da9fe"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:12:53.881138 master-0 kubenswrapper[7648]: I0308 03:12:53.880668 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.881138 master-0 kubenswrapper[7648]: I0308 03:12:53.880746 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:12:53.881138 master-0 kubenswrapper[7648]: I0308 03:12:53.880812 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.881138 master-0 kubenswrapper[7648]: I0308 03:12:53.880866 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxtt\" (UniqueName: \"kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.881138 master-0 kubenswrapper[7648]: I0308 03:12:53.880952 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.881138 master-0 kubenswrapper[7648]: I0308 03:12:53.881135 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7jv\" (UniqueName: \"kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.881304 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.881312 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.881825 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.881861 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.881965 7648 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.881982 7648 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.881991 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.882000 7648 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:53.883075 master-0 kubenswrapper[7648]: I0308 03:12:53.882055 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"
Mar 08 03:12:53.883829 master-0 kubenswrapper[7648]: I0308 03:12:53.883251 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"
Mar 08 03:12:53.883829 master-0 kubenswrapper[7648]: I0308 03:12:53.883756 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-client-ca" (OuterVolumeSpecName: "client-ca") pod "49384416-21b1-4d87-9ab4-77f0efbb9ff8" (UID: "49384416-21b1-4d87-9ab4-77f0efbb9ff8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:12:53.884213 master-0 kubenswrapper[7648]: I0308 03:12:53.884183 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-kube-api-access-7ml8w" (OuterVolumeSpecName: "kube-api-access-7ml8w") pod "64620a50-b5de-4b7f-84a3-a2df9d7da9fe" (UID: "64620a50-b5de-4b7f-84a3-a2df9d7da9fe"). InnerVolumeSpecName "kube-api-access-7ml8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:12:53.906472 master-0 kubenswrapper[7648]: I0308 03:12:53.906408 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxtt\" (UniqueName: \"kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"
Mar 08 03:12:53.909028 master-0 kubenswrapper[7648]: I0308 03:12:53.908998 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"]
Mar 08 03:12:53.912698 master-0 kubenswrapper[7648]: I0308 03:12:53.912662 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "64620a50-b5de-4b7f-84a3-a2df9d7da9fe" (UID: "64620a50-b5de-4b7f-84a3-a2df9d7da9fe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:12:53.912822 master-0 kubenswrapper[7648]: I0308 03:12:53.912801 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49384416-21b1-4d87-9ab4-77f0efbb9ff8-kube-api-access-msq9j" (OuterVolumeSpecName: "kube-api-access-msq9j") pod "49384416-21b1-4d87-9ab4-77f0efbb9ff8" (UID: "49384416-21b1-4d87-9ab4-77f0efbb9ff8"). InnerVolumeSpecName "kube-api-access-msq9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:12:53.912932 master-0 kubenswrapper[7648]: I0308 03:12:53.912885 7648 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49384416-21b1-4d87-9ab4-77f0efbb9ff8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49384416-21b1-4d87-9ab4-77f0efbb9ff8" (UID: "49384416-21b1-4d87-9ab4-77f0efbb9ff8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:12:53.913469 master-0 kubenswrapper[7648]: I0308 03:12:53.913439 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.982885 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.982935 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.983227 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7jv\" (UniqueName: \"kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.983263 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.983341 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.983355 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msq9j\" (UniqueName: \"kubernetes.io/projected/49384416-21b1-4d87-9ab4-77f0efbb9ff8-kube-api-access-msq9j\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.983367 7648 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49384416-21b1-4d87-9ab4-77f0efbb9ff8-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.983406 7648 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ml8w\" (UniqueName: \"kubernetes.io/projected/64620a50-b5de-4b7f-84a3-a2df9d7da9fe-kube-api-access-7ml8w\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:53.983749 master-0 kubenswrapper[7648]: I0308 03:12:53.983440 7648 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49384416-21b1-4d87-9ab4-77f0efbb9ff8-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:12:53.986785 master-0 kubenswrapper[7648]: I0308 03:12:53.984069 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:53.986785 master-0 kubenswrapper[7648]: I0308 03:12:53.984139 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:53.992857 master-0 kubenswrapper[7648]: I0308 03:12:53.992093 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:54.016976 master-0 kubenswrapper[7648]: I0308 03:12:54.015291 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7jv\" (UniqueName: \"kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:54.051330 master-0 kubenswrapper[7648]: I0308 03:12:54.047239 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"]
Mar 08 03:12:54.061317 master-0 kubenswrapper[7648]: I0308 03:12:54.059754 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"
Mar 08 03:12:54.068530 master-0 kubenswrapper[7648]: I0308 03:12:54.067702 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"]
Mar 08 03:12:54.225729 master-0 kubenswrapper[7648]: I0308 03:12:54.225335 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:54.307199 master-0 kubenswrapper[7648]: I0308 03:12:54.306613 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg"
Mar 08 03:12:54.307658 master-0 kubenswrapper[7648]: I0308 03:12:54.306669 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg" event={"ID":"49384416-21b1-4d87-9ab4-77f0efbb9ff8","Type":"ContainerDied","Data":"0c5d0055fe86ac5ea9ab079fff2e0abe8e3b575553100200612d0622ec310e85"}
Mar 08 03:12:54.307658 master-0 kubenswrapper[7648]: I0308 03:12:54.307271 7648 scope.go:117] "RemoveContainer" containerID="29ec3831341bb36e2171a6456449c15abed4681f1b44070d0757301e9477938f"
Mar 08 03:12:54.308889 master-0 kubenswrapper[7648]: I0308 03:12:54.308553 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb" event={"ID":"0e569889-4759-4046-b0ed-e550078521c6","Type":"ContainerStarted","Data":"b4547ec615a98dfc1a4d3f423cf139e6774712d10eb4a01d3d753a13dcc2d3fd"}
Mar 08 03:12:54.313274 master-0 kubenswrapper[7648]: I0308 03:12:54.313217 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" event={"ID":"17eaab63-9ba9-4a4a-891d-a76aa3f03b46","Type":"ContainerStarted","Data":"51ffe7f291da0631660c368ac239e0d75e0ce758bed86145baf4f7cdab100a2d"}
Mar 08 03:12:54.313360 master-0 kubenswrapper[7648]: I0308 03:12:54.313275 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" event={"ID":"17eaab63-9ba9-4a4a-891d-a76aa3f03b46","Type":"ContainerStarted","Data":"7ca736d5c0f4dfec580e9c43c992c5b2d64a84418a876f71fc7a9325b6f9c563"}
Mar 08 03:12:54.314589 master-0 kubenswrapper[7648]: I0308 03:12:54.314544 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr" event={"ID":"64620a50-b5de-4b7f-84a3-a2df9d7da9fe","Type":"ContainerDied","Data":"f96c717487c8fec762c20667ad24a9394c9f9417d53c2f93b4dfcdc28227e714"}
Mar 08 03:12:54.314700 master-0 kubenswrapper[7648]: I0308 03:12:54.314667 7648 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55644b7446-5ckmr"
Mar 08 03:12:54.322029 master-0 kubenswrapper[7648]: I0308 03:12:54.321994 7648 scope.go:117] "RemoveContainer" containerID="44c51ef09a1e4d640e56c548d7e597beadc76bae6917d92d87ab66115296db6e"
Mar 08 03:12:54.322155 master-0 kubenswrapper[7648]: I0308 03:12:54.322096 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerStarted","Data":"bf0f6570fe84a3058c6d6122c2d052c6c8b6d42a5f14c4cbfb5452cbc6866dd1"}
Mar 08 03:12:54.330409 master-0 kubenswrapper[7648]: I0308 03:12:54.329519 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm" event={"ID":"33c15b06-a21e-411f-b324-3ae0c7f0e9a4","Type":"ContainerStarted","Data":"11a698247fc6f9c54a51413289ea242dad54ebc3d1193e702f60f1dd98b867ee"}
Mar 08 03:12:54.331496 master-0 kubenswrapper[7648]: I0308 03:12:54.331106 7648 generic.go:334] "Generic (PLEG): container finished" podID="982ea338-c7be-4776-9bb7-113834c54aaa" containerID="5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56" exitCode=0
Mar 08 03:12:54.331756 master-0 kubenswrapper[7648]: I0308 03:12:54.331718 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" event={"ID":"982ea338-c7be-4776-9bb7-113834c54aaa","Type":"ContainerDied","Data":"5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56"}
Mar 08 03:12:54.331999 master-0 kubenswrapper[7648]: I0308 03:12:54.331979 7648 scope.go:117] "RemoveContainer" containerID="5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56"
Mar 08 03:12:54.381773 master-0 kubenswrapper[7648]: I0308 03:12:54.378977 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"]
Mar 08 03:12:54.424013 master-0 kubenswrapper[7648]: I0308 03:12:54.423926 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"]
Mar 08 03:12:54.425106 master-0 kubenswrapper[7648]: I0308 03:12:54.425072 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.426973 master-0 kubenswrapper[7648]: I0308 03:12:54.426947 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-r4dpg"
Mar 08 03:12:54.428559 master-0 kubenswrapper[7648]: I0308 03:12:54.428527 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 03:12:54.429117 master-0 kubenswrapper[7648]: I0308 03:12:54.429099 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 03:12:54.429230 master-0 kubenswrapper[7648]: I0308 03:12:54.429135 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 03:12:54.458091 master-0 kubenswrapper[7648]: I0308 03:12:54.457977 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-zd6kq"]
Mar 08 03:12:54.463438 master-0 kubenswrapper[7648]: I0308 03:12:54.463396 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"]
Mar 08 03:12:54.468426 master-0 kubenswrapper[7648]: I0308 03:12:54.468381 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55644b7446-5ckmr"]
Mar 08 03:12:54.479466 master-0 kubenswrapper[7648]: I0308 03:12:54.478756 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55644b7446-5ckmr"]
Mar 08 03:12:54.500960 master-0 kubenswrapper[7648]: I0308 03:12:54.500310 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg"]
Mar 08 03:12:54.501989 master-0 kubenswrapper[7648]: I0308 03:12:54.501863 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.501989 master-0 kubenswrapper[7648]: I0308 03:12:54.501921 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s27xv\" (UniqueName: \"kubernetes.io/projected/c3729e29-4c57-4f9b-8202-a87fd3a9a722-kube-api-access-s27xv\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.501989 master-0 kubenswrapper[7648]: I0308 03:12:54.501946 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.502100 master-0 kubenswrapper[7648]: I0308 03:12:54.501999 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.509681 master-0 kubenswrapper[7648]: I0308 03:12:54.508299 7648 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5696bdc5b4-tqtbg"]
Mar 08 03:12:54.603541 master-0 kubenswrapper[7648]: I0308 03:12:54.603016 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.603541 master-0 kubenswrapper[7648]: I0308 03:12:54.603074 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s27xv\" (UniqueName: \"kubernetes.io/projected/c3729e29-4c57-4f9b-8202-a87fd3a9a722-kube-api-access-s27xv\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.603541 master-0 kubenswrapper[7648]: I0308 03:12:54.603093 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.603541 master-0 kubenswrapper[7648]: I0308 03:12:54.603124 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.606101 master-0 kubenswrapper[7648]: I0308 03:12:54.604058 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.606101 master-0 kubenswrapper[7648]: I0308 03:12:54.606041 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.622726 master-0 kubenswrapper[7648]: I0308 03:12:54.622693 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27xv\" (UniqueName: \"kubernetes.io/projected/c3729e29-4c57-4f9b-8202-a87fd3a9a722-kube-api-access-s27xv\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.631213 master-0 kubenswrapper[7648]: I0308 03:12:54.631151 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"]
Mar 08 03:12:54.638341 master-0 kubenswrapper[7648]: I0308 03:12:54.638304 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:54.826593 master-0 kubenswrapper[7648]: I0308 03:12:54.826540 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"
Mar 08 03:12:55.309524 master-0 kubenswrapper[7648]: I0308 03:12:55.309472 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-qt654"]
Mar 08 03:12:55.344271 master-0 kubenswrapper[7648]: I0308 03:12:55.344230 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" event={"ID":"982ea338-c7be-4776-9bb7-113834c54aaa","Type":"ContainerStarted","Data":"6a754fcfb0d67c328aad3537f5cd3aea4c5a542bc823d6a29cf5e7022aa42ed0"}
Mar 08 03:12:55.348080 master-0 kubenswrapper[7648]: I0308 03:12:55.347748 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" event={"ID":"fe33f926-9348-4498-a892-d2becaeecc14","Type":"ContainerStarted","Data":"ad9c43aa5408a95caea805bac5b0b53795c025374c434eaed751f9b6d9a04367"}
Mar 08 03:12:55.348080 master-0 kubenswrapper[7648]: I0308 03:12:55.347802 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" event={"ID":"fe33f926-9348-4498-a892-d2becaeecc14","Type":"ContainerStarted","Data":"a31cf751005d98b0c093a07cba9d36fdd0b091f0fc3e6728bcde1b51934cdbef"}
Mar 08 03:12:55.348080 master-0 kubenswrapper[7648]: I0308 03:12:55.347817 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" event={"ID":"fe33f926-9348-4498-a892-d2becaeecc14","Type":"ContainerStarted","Data":"09504f5af1d7c056fa184727bb790ba83f7a308b15d1c9ebf34076ea08bbf988"}
Mar 08 03:12:55.351285 master-0 kubenswrapper[7648]: I0308 03:12:55.351241 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" event={"ID":"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5","Type":"ContainerStarted","Data":"ad59cc4c7958a82cb7e8357828383997f6ce39b4d62e09c7ada95209a7513c90"}
Mar 08 03:12:55.351285 master-0 kubenswrapper[7648]: I0308 03:12:55.351280 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" event={"ID":"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5","Type":"ContainerStarted","Data":"9c380d8376b93cf0d471da9a093b8dab4577d756ac31e0b75746f35b913cbd11"}
Mar 08 03:12:56.098754 master-0 kubenswrapper[7648]: I0308 03:12:55.352045 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:56.098754 master-0 kubenswrapper[7648]: I0308 03:12:55.353694 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" event={"ID":"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8","Type":"ContainerStarted","Data":"cc2c80df51f36394dc9dae6d283de990f51bae7d557e41ccd7171b38db33c170"}
Mar 08 03:12:56.098754 master-0 kubenswrapper[7648]: I0308 03:12:55.356602 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:12:56.137690 master-0 kubenswrapper[7648]: I0308 03:12:56.136993 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49384416-21b1-4d87-9ab4-77f0efbb9ff8" path="/var/lib/kubelet/pods/49384416-21b1-4d87-9ab4-77f0efbb9ff8/volumes"
Mar 08 03:12:56.142073 master-0 kubenswrapper[7648]: I0308 03:12:56.138552 7648 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64620a50-b5de-4b7f-84a3-a2df9d7da9fe" path="/var/lib/kubelet/pods/64620a50-b5de-4b7f-84a3-a2df9d7da9fe/volumes"
Mar 08 03:12:56.181390 master-0 kubenswrapper[7648]: I0308 03:12:56.178415 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6494b94d74-kwkcq"]
Mar 08 03:12:56.181390 master-0 kubenswrapper[7648]: I0308 03:12:56.179706 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.197350 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l8646"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.197578 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.197742 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.200810 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.201104 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.201352 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6494b94d74-kwkcq"]
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.201885 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8fg7\" (UniqueName: \"kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.201926 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.202040 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.202074 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.204933 master-0 kubenswrapper[7648]: I0308 03:12:56.202127 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.206287 master-0 kubenswrapper[7648]: I0308 03:12:56.205972 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podStartSLOduration=4.205955433 podStartE2EDuration="4.205955433s" podCreationTimestamp="2026-03-08 03:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:56.156016172 +0000 UTC m=+68.767334462" watchObservedRunningTime="2026-03-08 03:12:56.205955433 +0000 UTC m=+68.817273723"
Mar 08 03:12:56.213392 master-0 kubenswrapper[7648]: I0308 03:12:56.207007 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:12:56.213392 master-0 kubenswrapper[7648]: I0308 03:12:56.208853 7648 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 03:12:56.216774 master-0 kubenswrapper[7648]: I0308 03:12:56.216379 7648 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" podStartSLOduration=3.216355361 podStartE2EDuration="3.216355361s" podCreationTimestamp="2026-03-08 03:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:12:56.197962937 +0000 UTC m=+68.809281237" watchObservedRunningTime="2026-03-08 03:12:56.216355361 +0000 UTC m=+68.827673661"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.304756 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.304830 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.304849 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.304872 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.304927 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8fg7\" (UniqueName: \"kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.309696 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.310351 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.324751 master-0 kubenswrapper[7648]: I0308 03:12:56.319180 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.340544 master-0 kubenswrapper[7648]: I0308 03:12:56.335461 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.345173 master-0 kubenswrapper[7648]: I0308 03:12:56.330951 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8fg7\" (UniqueName: \"kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:56.571704 master-0 kubenswrapper[7648]: I0308 03:12:56.569678 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:12:58.064210 master-0 kubenswrapper[7648]: I0308 03:12:58.064074 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-j6n9g"]
Mar 08 03:12:58.065035 master-0 kubenswrapper[7648]: I0308 03:12:58.064942 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g"
Mar 08 03:12:58.067306 master-0 kubenswrapper[7648]: I0308 03:12:58.067257 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 08 03:12:58.166710 master-0 kubenswrapper[7648]: I0308 03:12:58.159809 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g"
Mar 08 03:12:58.166710 master-0 kubenswrapper[7648]: I0308 03:12:58.159880 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g"
Mar 08 03:12:58.166710 master-0 kubenswrapper[7648]: I0308 03:12:58.159975 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName:
\"kubernetes.io/host-path/1092f2a6-865c-4706-bba7-068621e85ebc-rootfs\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.166710 master-0 kubenswrapper[7648]: I0308 03:12:58.160008 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llwh7\" (UniqueName: \"kubernetes.io/projected/1092f2a6-865c-4706-bba7-068621e85ebc-kube-api-access-llwh7\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.260502 master-0 kubenswrapper[7648]: I0308 03:12:58.260420 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1092f2a6-865c-4706-bba7-068621e85ebc-rootfs\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.260502 master-0 kubenswrapper[7648]: I0308 03:12:58.260497 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llwh7\" (UniqueName: \"kubernetes.io/projected/1092f2a6-865c-4706-bba7-068621e85ebc-kube-api-access-llwh7\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.261347 master-0 kubenswrapper[7648]: I0308 03:12:58.260540 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.261347 master-0 
kubenswrapper[7648]: I0308 03:12:58.260590 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.261347 master-0 kubenswrapper[7648]: I0308 03:12:58.260601 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1092f2a6-865c-4706-bba7-068621e85ebc-rootfs\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.261509 master-0 kubenswrapper[7648]: I0308 03:12:58.261438 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.283824 master-0 kubenswrapper[7648]: I0308 03:12:58.283558 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.331163 master-0 kubenswrapper[7648]: I0308 03:12:58.331073 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llwh7\" (UniqueName: \"kubernetes.io/projected/1092f2a6-865c-4706-bba7-068621e85ebc-kube-api-access-llwh7\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " 
pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:12:58.507297 master-0 kubenswrapper[7648]: I0308 03:12:58.507249 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:13:05.435171 master-0 kubenswrapper[7648]: I0308 03:13:05.435128 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-qsgq7_70fba73e-c201-4866-bc69-64892ea5bdca/openshift-controller-manager-operator/0.log" Mar 08 03:13:05.435924 master-0 kubenswrapper[7648]: I0308 03:13:05.435192 7648 generic.go:334] "Generic (PLEG): container finished" podID="70fba73e-c201-4866-bc69-64892ea5bdca" containerID="7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e" exitCode=1 Mar 08 03:13:05.435924 master-0 kubenswrapper[7648]: I0308 03:13:05.435869 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" event={"ID":"70fba73e-c201-4866-bc69-64892ea5bdca","Type":"ContainerDied","Data":"7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e"} Mar 08 03:13:05.436493 master-0 kubenswrapper[7648]: I0308 03:13:05.436456 7648 scope.go:117] "RemoveContainer" containerID="7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e" Mar 08 03:13:05.436772 master-0 kubenswrapper[7648]: I0308 03:13:05.436721 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" event={"ID":"c3729e29-4c57-4f9b-8202-a87fd3a9a722","Type":"ContainerStarted","Data":"33074446cdbc88121d8e124ce9ef4086de417c43c65ed95b23d3c327d6455998"} Mar 08 03:13:06.101824 master-0 kubenswrapper[7648]: I0308 03:13:06.101778 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-f85rr"] Mar 08 
03:13:06.102952 master-0 kubenswrapper[7648]: I0308 03:13:06.102931 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:06.110653 master-0 kubenswrapper[7648]: I0308 03:13:06.110026 7648 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-29dgn" Mar 08 03:13:06.129881 master-0 kubenswrapper[7648]: I0308 03:13:06.129835 7648 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-f85rr"] Mar 08 03:13:06.246323 master-0 kubenswrapper[7648]: I0308 03:13:06.246241 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-webhook-certs\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:06.246323 master-0 kubenswrapper[7648]: I0308 03:13:06.246308 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpd47\" (UniqueName: \"kubernetes.io/projected/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-kube-api-access-xpd47\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:06.253340 master-0 kubenswrapper[7648]: E0308 03:13:06.253277 7648 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaadbbe97_2a03_40da_846d_252e29661f67.slice/crio-conmon-dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64620a50_b5de_4b7f_84a3_a2df9d7da9fe.slice/crio-44c51ef09a1e4d640e56c548d7e597beadc76bae6917d92d87ab66115296db6e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c0192f3_2e60_42c6_9836_c70a9fa407d5.slice/crio-conmon-2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64620a50_b5de_4b7f_84a3_a2df9d7da9fe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64620a50_b5de_4b7f_84a3_a2df9d7da9fe.slice/crio-f96c717487c8fec762c20667ad24a9394c9f9417d53c2f93b4dfcdc28227e714\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6432d23b_a55a_4131_83d5_5f16419809dd.slice/crio-d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982ea338_c7be_4776_9bb7_113834c54aaa.slice/crio-conmon-5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64620a50_b5de_4b7f_84a3_a2df9d7da9fe.slice/crio-conmon-44c51ef09a1e4d640e56c548d7e597beadc76bae6917d92d87ab66115296db6e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49384416_21b1_4d87_9ab4_77f0efbb9ff8.slice/crio-conmon-29ec3831341bb36e2171a6456449c15abed4681f1b44070d0757301e9477938f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c0192f3_2e60_42c6_9836_c70a9fa407d5.slice/crio-2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49384416_21b1_4d87_9ab4_77f0efbb9ff8.slice/crio-29ec3831341bb36e2171a6456449c15abed4681f1b44070d0757301e9477938f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982ea338_c7be_4776_9bb7_113834c54aaa.slice/crio-5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49384416_21b1_4d87_9ab4_77f0efbb9ff8.slice/crio-0c5d0055fe86ac5ea9ab079fff2e0abe8e3b575553100200612d0622ec310e85\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70fba73e_c201_4866_bc69_64892ea5bdca.slice/crio-7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6432d23b_a55a_4131_83d5_5f16419809dd.slice/crio-conmon-d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pode81d3c37_e8d7_44c8_973e_13992380ce85.slice/crio-conmon-d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3178dfc0_a35e_418e_a954_cd919b8af88c.slice/crio-a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-pode81d3c37_e8d7_44c8_973e_13992380ce85.slice/crio-d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3178dfc0_a35e_418e_a954_cd919b8af88c.slice/crio-conmon-a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode71caa06_6ce7_47c9_a267_21f6b6af9247.slice/crio-5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70fba73e_c201_4866_bc69_64892ea5bdca.slice/crio-conmon-7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e.scope\": RecentStats: unable to find data in memory cache]" Mar 08 03:13:06.347853 master-0 kubenswrapper[7648]: I0308 03:13:06.347800 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpd47\" (UniqueName: \"kubernetes.io/projected/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-kube-api-access-xpd47\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:06.348027 master-0 kubenswrapper[7648]: I0308 03:13:06.347885 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-webhook-certs\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:06.358163 master-0 kubenswrapper[7648]: I0308 03:13:06.358119 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-webhook-certs\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:06.362089 master-0 kubenswrapper[7648]: I0308 03:13:06.362046 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpd47\" (UniqueName: \"kubernetes.io/projected/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-kube-api-access-xpd47\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:06.431059 master-0 kubenswrapper[7648]: I0308 03:13:06.430998 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:13:07.109512 master-0 kubenswrapper[7648]: I0308 03:13:07.109405 7648 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4"] Mar 08 03:13:09.799798 master-0 kubenswrapper[7648]: I0308 03:13:09.799687 7648 generic.go:334] "Generic (PLEG): container finished" podID="8c0192f3-2e60-42c6-9836-c70a9fa407d5" containerID="2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9" exitCode=0 Mar 08 03:13:09.804341 master-0 kubenswrapper[7648]: I0308 03:13:09.803616 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" event={"ID":"8c0192f3-2e60-42c6-9836-c70a9fa407d5","Type":"ContainerDied","Data":"2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9"} Mar 08 03:13:09.804341 master-0 kubenswrapper[7648]: I0308 03:13:09.804291 7648 scope.go:117] "RemoveContainer" containerID="2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9" Mar 08 03:13:10.681053 master-0 kubenswrapper[7648]: I0308 
03:13:10.680886 7648 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:13:10.978274 master-0 kubenswrapper[7648]: I0308 03:13:10.978239 7648 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:13:10.978866 master-0 kubenswrapper[7648]: I0308 03:13:10.978848 7648 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 08 03:13:10.979026 master-0 kubenswrapper[7648]: I0308 03:13:10.978995 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:10.979075 master-0 kubenswrapper[7648]: I0308 03:13:10.979049 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" containerID="cri-o://66dcb2ef9f56c8175e9938f33a7650abc0b5ef0e638ee33a15fd5eee5cc90aba" gracePeriod=15 Mar 08 03:13:10.979143 master-0 kubenswrapper[7648]: I0308 03:13:10.979102 7648 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4408f61b9048ed833e9161e86cec42c8c15221795d207fe82e8f7a4527778dfb" gracePeriod=15 Mar 08 03:13:10.980679 master-0 kubenswrapper[7648]: I0308 03:13:10.980606 7648 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:13:10.981058 master-0 kubenswrapper[7648]: E0308 03:13:10.981038 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" 
Mar 08 03:13:10.981099 master-0 kubenswrapper[7648]: I0308 03:13:10.981062 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:13:10.981132 master-0 kubenswrapper[7648]: E0308 03:13:10.981121 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:13:10.981168 master-0 kubenswrapper[7648]: I0308 03:13:10.981132 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:13:10.981168 master-0 kubenswrapper[7648]: E0308 03:13:10.981157 7648 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:13:10.981168 master-0 kubenswrapper[7648]: I0308 03:13:10.981167 7648 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:13:10.981448 master-0 kubenswrapper[7648]: I0308 03:13:10.981373 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:13:10.981448 master-0 kubenswrapper[7648]: I0308 03:13:10.981446 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:13:10.981545 master-0 kubenswrapper[7648]: I0308 03:13:10.981466 7648 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:13:10.984363 master-0 kubenswrapper[7648]: I0308 03:13:10.984290 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.117644 master-0 kubenswrapper[7648]: I0308 03:13:11.117589 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.117832 master-0 kubenswrapper[7648]: I0308 03:13:11.117656 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.117832 master-0 kubenswrapper[7648]: I0308 03:13:11.117690 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.117832 master-0 kubenswrapper[7648]: I0308 03:13:11.117708 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.117832 master-0 kubenswrapper[7648]: I0308 03:13:11.117731 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.117832 master-0 kubenswrapper[7648]: I0308 03:13:11.117747 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.118003 master-0 kubenswrapper[7648]: I0308 03:13:11.117980 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.118038 master-0 kubenswrapper[7648]: I0308 03:13:11.118009 7648 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.219558 master-0 kubenswrapper[7648]: I0308 03:13:11.219441 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.219558 master-0 kubenswrapper[7648]: I0308 03:13:11.219585 7648 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.219558 master-0 kubenswrapper[7648]: I0308 03:13:11.219601 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.219558 master-0 kubenswrapper[7648]: I0308 03:13:11.219629 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.220240 master-0 kubenswrapper[7648]: I0308 03:13:11.219700 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.220240 master-0 kubenswrapper[7648]: I0308 03:13:11.219731 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220240 master-0 kubenswrapper[7648]: I0308 
03:13:11.219857 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220240 master-0 kubenswrapper[7648]: I0308 03:13:11.219993 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220240 master-0 kubenswrapper[7648]: I0308 03:13:11.220188 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220510 master-0 kubenswrapper[7648]: I0308 03:13:11.220407 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220510 master-0 kubenswrapper[7648]: I0308 03:13:11.220449 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.220621 master-0 kubenswrapper[7648]: I0308 03:13:11.220572 7648 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220724 master-0 kubenswrapper[7648]: I0308 03:13:11.220690 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220795 master-0 kubenswrapper[7648]: I0308 03:13:11.220735 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220795 master-0 kubenswrapper[7648]: I0308 03:13:11.220785 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.220922 master-0 kubenswrapper[7648]: I0308 03:13:11.220828 7648 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: 
\"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: I0308 03:13:11.257850 7648 patch_prober.go:28] interesting pod/bootstrap-kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]log ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]api-openshift-apiserver-available ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]api-openshift-oauth-apiserver-available ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]informer-sync ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/generic-apiserver-start-informers ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/priority-and-fairness-filter ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-apiextensions-informers ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: 
[+]poststarthook/start-apiextensions-controllers ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/crd-informer-synced ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-system-namespaces-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/rbac/bootstrap-roles ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/bootstrap-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/start-kube-aggregator-informers ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/apiservice-registration-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: 
[+]poststarthook/apiservice-discovery-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]autoregister-completion ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/apiservice-openapi-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: [-]shutdown failed: reason withheld Mar 08 03:13:11.257913 master-0 kubenswrapper[7648]: readyz check failed Mar 08 03:13:11.259089 master-0 kubenswrapper[7648]: I0308 03:13:11.257933 7648 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 03:13:11.627360 master-0 kubenswrapper[7648]: I0308 03:13:11.627307 7648 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:11.631373 master-0 kubenswrapper[7648]: I0308 03:13:11.631313 7648 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:11.647881 master-0 kubenswrapper[7648]: I0308 03:13:11.646356 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:13:11.647881 master-0 kubenswrapper[7648]: I0308 03:13:11.646399 7648 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:13:11.813357 master-0 kubenswrapper[7648]: I0308 03:13:11.813269 7648 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_e81d3c37-e8d7-44c8-973e-13992380ce85/installer/0.log" Mar 08 03:13:11.813616 master-0 kubenswrapper[7648]: I0308 03:13:11.813583 7648 generic.go:334] "Generic (PLEG): container finished" podID="e81d3c37-e8d7-44c8-973e-13992380ce85" containerID="d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e" exitCode=1 Mar 08 03:13:11.813747 master-0 kubenswrapper[7648]: I0308 03:13:11.813703 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"e81d3c37-e8d7-44c8-973e-13992380ce85","Type":"ContainerDied","Data":"d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e"} Mar 08 03:13:11.951953 master-0 kubenswrapper[7648]: I0308 03:13:11.951905 7648 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-zqlnx container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" start-of-body= Mar 08 03:13:11.952165 master-0 kubenswrapper[7648]: I0308 03:13:11.951976 7648 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" 
containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.5:8443/healthz\": dial tcp 10.128.0.5:8443: connect: connection refused" Mar 08 03:13:12.823100 master-0 kubenswrapper[7648]: I0308 03:13:12.822031 7648 generic.go:334] "Generic (PLEG): container finished" podID="3178dfc0-a35e-418e-a954-cd919b8af88c" containerID="a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0" exitCode=0 Mar 08 03:13:12.823100 master-0 kubenswrapper[7648]: I0308 03:13:12.822111 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" event={"ID":"3178dfc0-a35e-418e-a954-cd919b8af88c","Type":"ContainerDied","Data":"a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0"} Mar 08 03:13:12.823100 master-0 kubenswrapper[7648]: I0308 03:13:12.822684 7648 scope.go:117] "RemoveContainer" containerID="a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0" Mar 08 03:13:12.824099 master-0 kubenswrapper[7648]: I0308 03:13:12.824037 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:12.824359 master-0 kubenswrapper[7648]: I0308 03:13:12.824300 7648 generic.go:334] "Generic (PLEG): container finished" podID="aadbbe97-2a03-40da-846d-252e29661f67" containerID="dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62" exitCode=0 Mar 08 03:13:12.824406 master-0 kubenswrapper[7648]: I0308 03:13:12.824375 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" 
event={"ID":"aadbbe97-2a03-40da-846d-252e29661f67","Type":"ContainerDied","Data":"dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62"} Mar 08 03:13:12.824644 master-0 kubenswrapper[7648]: I0308 03:13:12.824602 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:12.825134 master-0 kubenswrapper[7648]: I0308 03:13:12.825086 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:12.825272 master-0 kubenswrapper[7648]: I0308 03:13:12.825234 7648 scope.go:117] "RemoveContainer" containerID="dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62" Mar 08 03:13:12.825702 master-0 kubenswrapper[7648]: I0308 03:13:12.825654 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:12.826814 master-0 kubenswrapper[7648]: I0308 03:13:12.826719 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:12.827691 master-0 kubenswrapper[7648]: I0308 03:13:12.827653 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:12.828731 master-0 kubenswrapper[7648]: I0308 03:13:12.828664 7648 status_manager.go:851] "Failed to get status for pod" podUID="aadbbe97-2a03-40da-846d-252e29661f67" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-86d7cdfdfb-sjdgk\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.834753 master-0 kubenswrapper[7648]: I0308 03:13:14.834704 7648 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" containerID="37ec7f6b3aeafa0c1aa240a3f289ec19e14a9c93e8dc0c62d0b70aca6f9a3fcf" exitCode=0 Mar 08 03:13:14.835287 master-0 kubenswrapper[7648]: I0308 03:13:14.834775 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerDied","Data":"37ec7f6b3aeafa0c1aa240a3f289ec19e14a9c93e8dc0c62d0b70aca6f9a3fcf"} Mar 08 03:13:14.835398 master-0 kubenswrapper[7648]: I0308 03:13:14.835372 7648 scope.go:117] "RemoveContainer" containerID="37ec7f6b3aeafa0c1aa240a3f289ec19e14a9c93e8dc0c62d0b70aca6f9a3fcf" Mar 08 03:13:14.836620 master-0 kubenswrapper[7648]: I0308 
03:13:14.836590 7648 status_manager.go:851] "Failed to get status for pod" podUID="7d23557f-6bb1-46ce-a56e-d0011c576125" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-olm-operator/pods/cluster-olm-operator-77899cf6d-h4ldq\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.837023 master-0 kubenswrapper[7648]: I0308 03:13:14.836997 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.837434 master-0 kubenswrapper[7648]: I0308 03:13:14.837384 7648 status_manager.go:851] "Failed to get status for pod" podUID="aadbbe97-2a03-40da-846d-252e29661f67" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-86d7cdfdfb-sjdgk\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.837978 master-0 kubenswrapper[7648]: I0308 03:13:14.837955 7648 generic.go:334] "Generic (PLEG): container finished" podID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" containerID="d901422733644b9a69bd0914635930a2d55c9786ff5a015eee041ee28b2a4386" exitCode=0 Mar 08 03:13:14.838121 master-0 kubenswrapper[7648]: I0308 03:13:14.838007 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" event={"ID":"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266","Type":"ContainerDied","Data":"d901422733644b9a69bd0914635930a2d55c9786ff5a015eee041ee28b2a4386"} Mar 08 03:13:14.838121 
master-0 kubenswrapper[7648]: I0308 03:13:14.837987 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.838462 master-0 kubenswrapper[7648]: I0308 03:13:14.838439 7648 scope.go:117] "RemoveContainer" containerID="d901422733644b9a69bd0914635930a2d55c9786ff5a015eee041ee28b2a4386" Mar 08 03:13:14.838839 master-0 kubenswrapper[7648]: I0308 03:13:14.838804 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.839340 master-0 kubenswrapper[7648]: I0308 03:13:14.839302 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.840469 master-0 kubenswrapper[7648]: I0308 03:13:14.840261 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.840809 master-0 kubenswrapper[7648]: I0308 
03:13:14.840778 7648 status_manager.go:851] "Failed to get status for pod" podUID="7d23557f-6bb1-46ce-a56e-d0011c576125" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-olm-operator/pods/cluster-olm-operator-77899cf6d-h4ldq\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.841156 master-0 kubenswrapper[7648]: I0308 03:13:14.841134 7648 generic.go:334] "Generic (PLEG): container finished" podID="e71caa06-6ce7-47c9-a267-21f6b6af9247" containerID="5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2" exitCode=0 Mar 08 03:13:14.841228 master-0 kubenswrapper[7648]: I0308 03:13:14.841181 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" event={"ID":"e71caa06-6ce7-47c9-a267-21f6b6af9247","Type":"ContainerDied","Data":"5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2"} Mar 08 03:13:14.841527 master-0 kubenswrapper[7648]: I0308 03:13:14.841508 7648 scope.go:117] "RemoveContainer" containerID="5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2" Mar 08 03:13:14.842413 master-0 kubenswrapper[7648]: I0308 03:13:14.842206 7648 status_manager.go:851] "Failed to get status for pod" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-7c6989d6c4-zqlnx\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.842990 master-0 kubenswrapper[7648]: I0308 03:13:14.842738 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.843657 master-0 kubenswrapper[7648]: I0308 03:13:14.843413 7648 status_manager.go:851] "Failed to get status for pod" podUID="aadbbe97-2a03-40da-846d-252e29661f67" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-86d7cdfdfb-sjdgk\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.843979 master-0 kubenswrapper[7648]: I0308 03:13:14.843943 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.844571 master-0 kubenswrapper[7648]: I0308 03:13:14.844187 7648 generic.go:334] "Generic (PLEG): container finished" podID="c0a08ddb-1045-4631-ba52-93f3046ebd0a" containerID="1a20bdbedb5b13853225f367842b80deec1d4120a3bc963794fd1350f7fbce22" exitCode=0 Mar 08 03:13:14.844571 master-0 kubenswrapper[7648]: I0308 03:13:14.844230 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" event={"ID":"c0a08ddb-1045-4631-ba52-93f3046ebd0a","Type":"ContainerDied","Data":"1a20bdbedb5b13853225f367842b80deec1d4120a3bc963794fd1350f7fbce22"} Mar 08 03:13:14.844681 master-0 kubenswrapper[7648]: I0308 03:13:14.844643 7648 scope.go:117] "RemoveContainer" containerID="1a20bdbedb5b13853225f367842b80deec1d4120a3bc963794fd1350f7fbce22" Mar 08 03:13:14.845150 
master-0 kubenswrapper[7648]: I0308 03:13:14.845005 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.845609 master-0 kubenswrapper[7648]: I0308 03:13:14.845564 7648 status_manager.go:851] "Failed to get status for pod" podUID="7d23557f-6bb1-46ce-a56e-d0011c576125" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-olm-operator/pods/cluster-olm-operator-77899cf6d-h4ldq\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.846377 master-0 kubenswrapper[7648]: I0308 03:13:14.846096 7648 status_manager.go:851] "Failed to get status for pod" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-7c6989d6c4-zqlnx\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.846830 master-0 kubenswrapper[7648]: I0308 03:13:14.846795 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.847098 master-0 kubenswrapper[7648]: I0308 03:13:14.847015 7648 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" 
containerID="4408f61b9048ed833e9161e86cec42c8c15221795d207fe82e8f7a4527778dfb" exitCode=0 Mar 08 03:13:14.847646 master-0 kubenswrapper[7648]: I0308 03:13:14.847608 7648 status_manager.go:851] "Failed to get status for pod" podUID="aadbbe97-2a03-40da-846d-252e29661f67" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-86d7cdfdfb-sjdgk\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.848947 master-0 kubenswrapper[7648]: I0308 03:13:14.848434 7648 status_manager.go:851] "Failed to get status for pod" podUID="e71caa06-6ce7-47c9-a267-21f6b6af9247" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-7f65c457f5-8wv6c\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.848947 master-0 kubenswrapper[7648]: I0308 03:13:14.848683 7648 generic.go:334] "Generic (PLEG): container finished" podID="7ea81472-8a81-45ec-a07d-8710f47a927d" containerID="d57e1157c7569d934ea76665ae63811243fb6a6eb902e18c216d3947853ca6e4" exitCode=0 Mar 08 03:13:14.848947 master-0 kubenswrapper[7648]: I0308 03:13:14.848730 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"7ea81472-8a81-45ec-a07d-8710f47a927d","Type":"ContainerDied","Data":"d57e1157c7569d934ea76665ae63811243fb6a6eb902e18c216d3947853ca6e4"} Mar 08 03:13:14.848947 master-0 kubenswrapper[7648]: I0308 03:13:14.848919 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.850053 master-0 kubenswrapper[7648]: I0308 03:13:14.850015 7648 status_manager.go:851] "Failed to get status for pod" podUID="aadbbe97-2a03-40da-846d-252e29661f67" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-86d7cdfdfb-sjdgk\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.850541 master-0 kubenswrapper[7648]: I0308 03:13:14.850508 7648 status_manager.go:851] "Failed to get status for pod" podUID="e71caa06-6ce7-47c9-a267-21f6b6af9247" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-7f65c457f5-8wv6c\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.850835 master-0 kubenswrapper[7648]: I0308 03:13:14.850804 7648 generic.go:334] "Generic (PLEG): container finished" podID="b83ab56c-e28d-4e82-ae8f-92649a1448ed" containerID="ab858aba9fe747164d134176fff1d99d6f77b5114eeaf6f38c2480128cb7485f" exitCode=0 Mar 08 03:13:14.850906 master-0 kubenswrapper[7648]: I0308 03:13:14.850845 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" event={"ID":"b83ab56c-e28d-4e82-ae8f-92649a1448ed","Type":"ContainerDied","Data":"ab858aba9fe747164d134176fff1d99d6f77b5114eeaf6f38c2480128cb7485f"} Mar 08 03:13:14.851165 master-0 kubenswrapper[7648]: I0308 03:13:14.851137 7648 scope.go:117] "RemoveContainer" 
containerID="ab858aba9fe747164d134176fff1d99d6f77b5114eeaf6f38c2480128cb7485f" Mar 08 03:13:14.856886 master-0 kubenswrapper[7648]: I0308 03:13:14.856623 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.857671 master-0 kubenswrapper[7648]: I0308 03:13:14.857399 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.858417 master-0 kubenswrapper[7648]: I0308 03:13:14.858117 7648 status_manager.go:851] "Failed to get status for pod" podUID="7d23557f-6bb1-46ce-a56e-d0011c576125" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-olm-operator/pods/cluster-olm-operator-77899cf6d-h4ldq\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:13:14.858417 master-0 kubenswrapper[7648]: I0308 03:13:14.858309 7648 generic.go:334] "Generic (PLEG): container finished" podID="6432d23b-a55a-4131-83d5-5f16419809dd" containerID="d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a" exitCode=0 Mar 08 03:13:14.858417 master-0 kubenswrapper[7648]: I0308 03:13:14.858373 7648 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" 
event={"ID":"6432d23b-a55a-4131-83d5-5f16419809dd","Type":"ContainerDied","Data":"d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a"}
Mar 08 03:13:14.859025 master-0 kubenswrapper[7648]: I0308 03:13:14.858836 7648 status_manager.go:851] "Failed to get status for pod" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-7c6989d6c4-zqlnx\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.859025 master-0 kubenswrapper[7648]: I0308 03:13:14.858853 7648 scope.go:117] "RemoveContainer" containerID="d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a"
Mar 08 03:13:14.859353 master-0 kubenswrapper[7648]: I0308 03:13:14.859280 7648 status_manager.go:851] "Failed to get status for pod" podUID="c0a08ddb-1045-4631-ba52-93f3046ebd0a" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-service-ca-operator/pods/service-ca-operator-69b6fc6b88-57b4v\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.859935 master-0 kubenswrapper[7648]: I0308 03:13:14.859877 7648 status_manager.go:851] "Failed to get status for pod" podUID="b83ab56c-e28d-4e82-ae8f-92649a1448ed" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5c74bfc494-hw2kt\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.860563 master-0 kubenswrapper[7648]: I0308 03:13:14.860534 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.861050 master-0 kubenswrapper[7648]: I0308 03:13:14.861015 7648 status_manager.go:851] "Failed to get status for pod" podUID="7ea81472-8a81-45ec-a07d-8710f47a927d" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.861520 master-0 kubenswrapper[7648]: I0308 03:13:14.861476 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.861898 master-0 kubenswrapper[7648]: I0308 03:13:14.861861 7648 status_manager.go:851] "Failed to get status for pod" podUID="7d23557f-6bb1-46ce-a56e-d0011c576125" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-olm-operator/pods/cluster-olm-operator-77899cf6d-h4ldq\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.862389 master-0 kubenswrapper[7648]: I0308 03:13:14.862354 7648 status_manager.go:851] "Failed to get status for pod" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-7c6989d6c4-zqlnx\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.862988 master-0 kubenswrapper[7648]: I0308 03:13:14.862939 7648 status_manager.go:851] "Failed to get status for pod" podUID="c0a08ddb-1045-4631-ba52-93f3046ebd0a" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-service-ca-operator/pods/service-ca-operator-69b6fc6b88-57b4v\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.863464 master-0 kubenswrapper[7648]: I0308 03:13:14.863401 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.863867 master-0 kubenswrapper[7648]: I0308 03:13:14.863833 7648 status_manager.go:851] "Failed to get status for pod" podUID="aadbbe97-2a03-40da-846d-252e29661f67" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-86d7cdfdfb-sjdgk\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.864384 master-0 kubenswrapper[7648]: I0308 03:13:14.864348 7648 status_manager.go:851] "Failed to get status for pod" podUID="6432d23b-a55a-4131-83d5-5f16419809dd" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-799b6db4d7-krqcr\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:14.864797 master-0 kubenswrapper[7648]: I0308 03:13:14.864755 7648 status_manager.go:851] "Failed to get status for pod" podUID="e71caa06-6ce7-47c9-a267-21f6b6af9247" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-7f65c457f5-8wv6c\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.178440 master-0 kubenswrapper[7648]: E0308 03:13:17.178252 7648 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=<
Mar 08 03:13:17.178440 master-0 kubenswrapper[7648]: &Event{ObjectMeta:{authentication-operator-7c6989d6c4-zqlnx.189abf2a5a85b4d0 openshift-authentication-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication-operator,Name:authentication-operator-7c6989d6c4-zqlnx,UID:f08a644f-3b61-46a7-a7b6-a9f7f2f7d266,APIVersion:v1,ResourceVersion:3600,FieldPath:spec.containers{authentication-operator},},Reason:ProbeError,Message:Liveness probe error: Get "https://10.128.0.5:8443/healthz": dial tcp 10.128.0.5:8443: connect: connection refused
Mar 08 03:13:17.178440 master-0 kubenswrapper[7648]: body:
Mar 08 03:13:17.178440 master-0 kubenswrapper[7648]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:11.951955152 +0000 UTC m=+84.563273452,LastTimestamp:2026-03-08 03:13:11.951955152 +0000 UTC m=+84.563273452,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}
Mar 08 03:13:17.178440 master-0 kubenswrapper[7648]: >
Mar 08 03:13:17.617968 master-0 kubenswrapper[7648]: I0308 03:13:17.617338 7648 status_manager.go:851] "Failed to get status for pod" podUID="6432d23b-a55a-4131-83d5-5f16419809dd" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-799b6db4d7-krqcr\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.619944 master-0 kubenswrapper[7648]: I0308 03:13:17.619897 7648 status_manager.go:851] "Failed to get status for pod" podUID="e71caa06-6ce7-47c9-a267-21f6b6af9247" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-7f65c457f5-8wv6c\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.620668 master-0 kubenswrapper[7648]: I0308 03:13:17.620632 7648 status_manager.go:851] "Failed to get status for pod" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver-operator/pods/kube-apiserver-operator-68bd585b-hg2f6\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.621496 master-0 kubenswrapper[7648]: I0308 03:13:17.621412 7648 status_manager.go:851] "Failed to get status for pod" podUID="b83ab56c-e28d-4e82-ae8f-92649a1448ed" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5c74bfc494-hw2kt\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.622505 master-0 kubenswrapper[7648]: I0308 03:13:17.622404 7648 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.623968 master-0 kubenswrapper[7648]: I0308 03:13:17.623913 7648 status_manager.go:851] "Failed to get status for pod" podUID="7ea81472-8a81-45ec-a07d-8710f47a927d" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.625244 master-0 kubenswrapper[7648]: I0308 03:13:17.625193 7648 status_manager.go:851] "Failed to get status for pod" podUID="7d23557f-6bb1-46ce-a56e-d0011c576125" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-olm-operator/pods/cluster-olm-operator-77899cf6d-h4ldq\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.626424 master-0 kubenswrapper[7648]: I0308 03:13:17.626360 7648 status_manager.go:851] "Failed to get status for pod" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-7c6989d6c4-zqlnx\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.627974 master-0 kubenswrapper[7648]: I0308 03:13:17.627905 7648 status_manager.go:851] "Failed to get status for pod" podUID="c0a08ddb-1045-4631-ba52-93f3046ebd0a" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-service-ca-operator/pods/service-ca-operator-69b6fc6b88-57b4v\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.629020 master-0 kubenswrapper[7648]: I0308 03:13:17.628967 7648 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.630106 master-0 kubenswrapper[7648]: I0308 03:13:17.630026 7648 status_manager.go:851] "Failed to get status for pod" podUID="aadbbe97-2a03-40da-846d-252e29661f67" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-86d7cdfdfb-sjdgk\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:13:17.789354 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Mar 08 03:13:17.790644 master-0 kubenswrapper[7648]: I0308 03:13:17.789395 7648 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:13:17.822707 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Mar 08 03:13:17.822965 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Mar 08 03:13:17.824619 master-0 systemd[1]: kubelet.service: Consumed 21.060s CPU time.
Mar 08 03:13:17.848381 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 08 03:13:17.968803 master-0 kubenswrapper[13046]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:13:17.968803 master-0 kubenswrapper[13046]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 08 03:13:17.968803 master-0 kubenswrapper[13046]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:13:17.968803 master-0 kubenswrapper[13046]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:13:17.968803 master-0 kubenswrapper[13046]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 08 03:13:17.968803 master-0 kubenswrapper[13046]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 08 03:13:17.969428 master-0 kubenswrapper[13046]: I0308 03:13:17.968881 13046 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 08 03:13:17.971074 master-0 kubenswrapper[13046]: W0308 03:13:17.971049 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:13:17.971074 master-0 kubenswrapper[13046]: W0308 03:13:17.971067 13046 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:13:17.971074 master-0 kubenswrapper[13046]: W0308 03:13:17.971072 13046 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:13:17.971074 master-0 kubenswrapper[13046]: W0308 03:13:17.971077 13046 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:13:17.971074 master-0 kubenswrapper[13046]: W0308 03:13:17.971082 13046 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971087 13046 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971091 13046 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971095 13046 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971098 13046 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971102 13046 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971106 13046 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971109 13046 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971113 13046 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971117 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971122 13046 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971126 13046 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971130 13046 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971135 13046 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971140 13046 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971145 13046 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971148 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971152 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971156 13046 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:13:17.971304 master-0 kubenswrapper[13046]: W0308 03:13:17.971159 13046 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971164 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971167 13046 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971171 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971174 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971178 13046 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971181 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971186 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971189 13046 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971193 13046 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971196 13046 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971200 13046 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971203 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971208 13046 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971212 13046 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971216 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971220 13046 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971223 13046 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971227 13046 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971230 13046 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:13:17.971911 master-0 kubenswrapper[13046]: W0308 03:13:17.971233 13046 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971237 13046 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971241 13046 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971244 13046 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971248 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971251 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971255 13046 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971264 13046 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971268 13046 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971271 13046 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971275 13046 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971278 13046 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971282 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971285 13046 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971291 13046 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971295 13046 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971298 13046 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971308 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971313 13046 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971317 13046 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971321 13046 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:13:17.972540 master-0 kubenswrapper[13046]: W0308 03:13:17.971325 13046 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: W0308 03:13:17.971330 13046 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: W0308 03:13:17.971337 13046 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: W0308 03:13:17.971342 13046 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: W0308 03:13:17.971347 13046 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: W0308 03:13:17.971352 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: W0308 03:13:17.971355 13046 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: W0308 03:13:17.971359 13046 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971434 13046 flags.go:64] FLAG: --address="0.0.0.0"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971443 13046 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971449 13046 flags.go:64] FLAG: --anonymous-auth="true"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971454 13046 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971459 13046 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971464 13046 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971469 13046 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971474 13046 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971493 13046 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971497 13046 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971503 13046 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971507 13046 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971511 13046 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 08 03:13:17.973220 master-0 kubenswrapper[13046]: I0308 03:13:17.971515 13046 flags.go:64] FLAG: --cgroup-root=""
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971519 13046 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971523 13046 flags.go:64] FLAG: --client-ca-file=""
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971527 13046 flags.go:64] FLAG: --cloud-config=""
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971531 13046 flags.go:64] FLAG: --cloud-provider=""
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971535 13046 flags.go:64] FLAG: --cluster-dns="[]"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971543 13046 flags.go:64] FLAG: --cluster-domain=""
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971548 13046 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971552 13046 flags.go:64] FLAG: --config-dir=""
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971556 13046 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971560 13046 flags.go:64] FLAG: --container-log-max-files="5"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971570 13046 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971575 13046 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971580 13046 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971584 13046 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971588 13046 flags.go:64] FLAG: --contention-profiling="false"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971592 13046 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971596 13046 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971601 13046 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971605 13046 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971610 13046 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971614 13046 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971618 13046 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971622 13046 flags.go:64] FLAG: --enable-load-reader="false"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971626 13046 flags.go:64] FLAG: --enable-server="true"
Mar 08 03:13:17.974358 master-0 kubenswrapper[13046]: I0308 03:13:17.971630 13046 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971637 13046 flags.go:64] FLAG: --event-burst="100"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971641 13046 flags.go:64] FLAG: --event-qps="50"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971645 13046 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971649 13046 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971653 13046 flags.go:64] FLAG: --eviction-hard=""
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971658 13046 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971662 13046 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971666 13046 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971674 13046 flags.go:64] FLAG: --eviction-soft=""
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971678 13046 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971682 13046 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971686 13046 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971690 13046 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971694 13046 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971698 13046 flags.go:64] FLAG: --fail-swap-on="true"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971702 13046 flags.go:64] FLAG: --feature-gates=""
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971707 13046 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971711 13046 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971716 13046 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971720 13046 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971724 13046 flags.go:64] FLAG: --healthz-port="10248"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971733 13046 flags.go:64] FLAG: --help="false"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971738 13046 flags.go:64] FLAG: --hostname-override=""
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971742 13046 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971746 13046 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 08 03:13:17.975009 master-0 kubenswrapper[13046]: I0308 03:13:17.971750 13046 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971754 13046 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971759 13046 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971763 13046 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971767 13046 flags.go:64] FLAG: --image-service-endpoint=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971771 13046 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971775 13046 flags.go:64] FLAG: --kube-api-burst="100"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971779 13046 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971783 13046 flags.go:64] FLAG: --kube-api-qps="50"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971787 13046 flags.go:64] FLAG: --kube-reserved=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971791 13046 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971795 13046 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971799 13046 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971803 13046 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971807 13046 flags.go:64] FLAG: --lock-file=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971815 13046 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971819 13046 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971823 13046 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971829 13046 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971833 13046 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971837 13046 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971841 13046 flags.go:64] FLAG: --logging-format="text"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971845 13046 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971849 13046 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971853 13046 flags.go:64] FLAG: --manifest-url=""
Mar 08 03:13:17.975691 master-0 kubenswrapper[13046]: I0308 03:13:17.971857 13046 flags.go:64] FLAG: --manifest-url-header=""
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971863 13046 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971868 13046 flags.go:64] FLAG: --max-open-files="1000000"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971873 13046 flags.go:64] FLAG: --max-pods="110"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971877 13046 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971881 13046 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971885 13046 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971895 13046 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971900 13046 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971904 13046 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971908 13046 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971917 13046 flags.go:64] FLAG:
--node-status-max-images="50" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971922 13046 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971926 13046 flags.go:64] FLAG: --oom-score-adj="-999" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971930 13046 flags.go:64] FLAG: --pod-cidr="" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971935 13046 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971943 13046 flags.go:64] FLAG: --pod-manifest-path="" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971948 13046 flags.go:64] FLAG: --pod-max-pids="-1" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971952 13046 flags.go:64] FLAG: --pods-per-core="0" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971956 13046 flags.go:64] FLAG: --port="10250" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971960 13046 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971964 13046 flags.go:64] FLAG: --provider-id="" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971978 13046 flags.go:64] FLAG: --qos-reserved="" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971982 13046 flags.go:64] FLAG: --read-only-port="10255" Mar 08 03:13:17.976294 master-0 kubenswrapper[13046]: I0308 03:13:17.971986 13046 flags.go:64] FLAG: --register-node="true" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.971990 13046 flags.go:64] FLAG: --register-schedulable="true" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.971994 13046 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972001 13046 flags.go:64] FLAG: --registry-burst="10" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972005 13046 flags.go:64] FLAG: --registry-qps="5" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972015 13046 flags.go:64] FLAG: --reserved-cpus="" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972019 13046 flags.go:64] FLAG: --reserved-memory="" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972024 13046 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972028 13046 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972032 13046 flags.go:64] FLAG: --rotate-certificates="false" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972036 13046 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972040 13046 flags.go:64] FLAG: --runonce="false" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972044 13046 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972048 13046 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972052 13046 flags.go:64] FLAG: --seccomp-default="false" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972056 13046 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972060 13046 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972064 13046 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972068 13046 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972077 13046 flags.go:64] FLAG: --storage-driver-password="root" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972081 13046 flags.go:64] FLAG: --storage-driver-secure="false" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972085 13046 flags.go:64] FLAG: --storage-driver-table="stats" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972089 13046 flags.go:64] FLAG: --storage-driver-user="root" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972093 13046 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972097 13046 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 08 03:13:17.976955 master-0 kubenswrapper[13046]: I0308 03:13:17.972101 13046 flags.go:64] FLAG: --system-cgroups="" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972105 13046 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972111 13046 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972115 13046 flags.go:64] FLAG: --tls-cert-file="" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972119 13046 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972130 13046 flags.go:64] FLAG: --tls-min-version="" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972134 13046 flags.go:64] FLAG: --tls-private-key-file="" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972138 13046 flags.go:64] FLAG: 
--topology-manager-policy="none" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972142 13046 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972146 13046 flags.go:64] FLAG: --topology-manager-scope="container" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972150 13046 flags.go:64] FLAG: --v="2" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972155 13046 flags.go:64] FLAG: --version="false" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972163 13046 flags.go:64] FLAG: --vmodule="" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972167 13046 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: I0308 03:13:17.972171 13046 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972286 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972291 13046 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972295 13046 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972299 13046 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972303 13046 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972307 13046 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972311 13046 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 
03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972315 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:13:17.978283 master-0 kubenswrapper[13046]: W0308 03:13:17.972319 13046 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972323 13046 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972327 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972331 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972335 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972340 13046 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972343 13046 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972352 13046 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972356 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972359 13046 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972363 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972367 13046 feature_gate.go:330] unrecognized 
feature gate: VSphereMultiVCenters Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972371 13046 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972375 13046 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972382 13046 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972386 13046 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972390 13046 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972394 13046 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972399 13046 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:13:17.979186 master-0 kubenswrapper[13046]: W0308 03:13:17.972403 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972407 13046 feature_gate.go:330] unrecognized feature gate: Example Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972414 13046 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972417 13046 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972421 13046 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972424 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972428 13046 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972431 13046 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972435 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972438 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972442 13046 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972445 13046 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 03:13:17.979804 master-0 
kubenswrapper[13046]: W0308 03:13:17.972450 13046 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972454 13046 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972458 13046 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972462 13046 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972466 13046 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972470 13046 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972473 13046 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972496 13046 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 03:13:17.979804 master-0 kubenswrapper[13046]: W0308 03:13:17.972500 13046 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972503 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972507 13046 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972510 13046 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972519 13046 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 03:13:17.980426 master-0 
kubenswrapper[13046]: W0308 03:13:17.972523 13046 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972526 13046 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972533 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972536 13046 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972540 13046 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972544 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972548 13046 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972552 13046 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972556 13046 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972563 13046 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972568 13046 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972572 13046 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972576 13046 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972580 13046 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972583 13046 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 03:13:17.980426 master-0 kubenswrapper[13046]: W0308 03:13:17.972587 13046 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 03:13:17.981091 master-0 kubenswrapper[13046]: W0308 03:13:17.972590 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 08 03:13:17.981091 master-0 kubenswrapper[13046]: W0308 03:13:17.972594 13046 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 03:13:17.981091 master-0 kubenswrapper[13046]: W0308 03:13:17.972598 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:13:17.981091 master-0 kubenswrapper[13046]: W0308 03:13:17.972601 13046 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 03:13:17.981091 master-0 kubenswrapper[13046]: I0308 03:13:17.972607 13046 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 03:13:17.985100 master-0 kubenswrapper[13046]: I0308 03:13:17.985062 13046 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 08 03:13:17.985100 master-0 kubenswrapper[13046]: I0308 03:13:17.985092 13046 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 08 03:13:17.986514 master-0 kubenswrapper[13046]: W0308 03:13:17.986427 13046 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 03:13:17.987577 master-0 kubenswrapper[13046]: W0308 03:13:17.986876 13046 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 03:13:17.987911 master-0 kubenswrapper[13046]: W0308 03:13:17.987871 13046 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 03:13:17.987911 master-0 kubenswrapper[13046]: W0308 03:13:17.987895 13046 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 03:13:17.987911 master-0 kubenswrapper[13046]: W0308 03:13:17.987903 13046 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 03:13:17.988050 master-0 kubenswrapper[13046]: W0308 03:13:17.987909 13046 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 03:13:17.988050 master-0 kubenswrapper[13046]: W0308 03:13:17.988042 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 03:13:17.988128 master-0 kubenswrapper[13046]: W0308 03:13:17.988053 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 03:13:17.988128 master-0 kubenswrapper[13046]: W0308 03:13:17.988064 13046 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 03:13:17.988128 master-0 kubenswrapper[13046]: W0308 03:13:17.988072 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 03:13:17.988128 
master-0 kubenswrapper[13046]: W0308 03:13:17.988080 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 03:13:17.988128 master-0 kubenswrapper[13046]: W0308 03:13:17.988087 13046 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988094 13046 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988148 13046 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988205 13046 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988218 13046 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988224 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988230 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988236 13046 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988243 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988253 13046 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988261 13046 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988267 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988273 13046 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988299 13046 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988306 13046 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988313 13046 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 03:13:17.988297 master-0 kubenswrapper[13046]: W0308 03:13:17.988320 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988327 13046 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988333 13046 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988359 13046 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988367 13046 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988381 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988387 13046 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 03:13:17.988837 
master-0 kubenswrapper[13046]: W0308 03:13:17.988394 13046 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988399 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988405 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988410 13046 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988416 13046 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988421 13046 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988427 13046 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988433 13046 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988438 13046 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988493 13046 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988500 13046 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988527 13046 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988533 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 03:13:17.988837 master-0 kubenswrapper[13046]: W0308 03:13:17.988538 13046 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988543 13046 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988549 13046 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988554 13046 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988560 13046 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988587 13046 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988595 13046 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988600 13046 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988606 13046 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988611 13046 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988619 13046 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988625 13046 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988631 13046 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988638 13046 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988643 13046 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988649 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988655 13046 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988660 13046 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988667 13046 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:13:17.989743 master-0 kubenswrapper[13046]: W0308 03:13:17.988675 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.988682 13046 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.988687 13046 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.988693 13046 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.988699 13046 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.988704 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: I0308 03:13:17.988715 13046 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989155 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989166 13046 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989173 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989178 13046 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989184 13046 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989189 13046 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989195 13046 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989200 13046 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 08 03:13:17.990308 master-0 kubenswrapper[13046]: W0308 03:13:17.989206 13046 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989211 13046 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989216 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989222 13046 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989227 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989233 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989238 13046 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989243 13046 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989249 13046 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989256 13046 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989261 13046 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989267 13046 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989276 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989281 13046 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989286 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989292 13046 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989297 13046 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989302 13046 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989308 13046 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989335 13046 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 08 03:13:17.990906 master-0 kubenswrapper[13046]: W0308 03:13:17.989342 13046 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989348 13046 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989353 13046 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989358 13046 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989364 13046 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989369 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989375 13046 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989380 13046 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989386 13046 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989391 13046 feature_gate.go:330] unrecognized feature gate: Example
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989398 13046 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989404 13046 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989409 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989414 13046 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989420 13046 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989425 13046 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989603 13046 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989615 13046 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989620 13046 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989626 13046 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 08 03:13:17.992127 master-0 kubenswrapper[13046]: W0308 03:13:17.989632 13046 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989639 13046 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989645 13046 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989653 13046 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989660 13046 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989667 13046 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989673 13046 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989788 13046 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989797 13046 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989804 13046 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989810 13046 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989816 13046 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989822 13046 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989829 13046 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989834 13046 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989840 13046 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989846 13046 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989852 13046 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989857 13046 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 08 03:13:17.993174 master-0 kubenswrapper[13046]: W0308 03:13:17.989863 13046 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 08 03:13:17.993889 master-0 kubenswrapper[13046]: W0308 03:13:17.989869 13046 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 08 03:13:17.993889 master-0 kubenswrapper[13046]: W0308 03:13:17.989874 13046 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 08 03:13:17.993889 master-0 kubenswrapper[13046]: W0308 03:13:17.989879 13046 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 08 03:13:17.993889 master-0 kubenswrapper[13046]: W0308 03:13:17.989885 13046 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 08 03:13:17.993889 master-0 kubenswrapper[13046]: I0308 03:13:17.989895 13046 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 08 03:13:17.993889 master-0 kubenswrapper[13046]: I0308 03:13:17.990183 13046 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 08 03:13:17.994323 master-0 kubenswrapper[13046]: I0308 03:13:17.994196 13046 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 08 03:13:17.994386 master-0 kubenswrapper[13046]: I0308 03:13:17.994370 13046 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 08 03:13:17.994819 master-0 kubenswrapper[13046]: I0308 03:13:17.994789 13046 server.go:997] "Starting client certificate rotation"
Mar 08 03:13:17.994819 master-0 kubenswrapper[13046]: I0308 03:13:17.994811 13046 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 08 03:13:17.995064 master-0 kubenswrapper[13046]: I0308 03:13:17.994962 13046 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-09 03:01:34 +0000 UTC, rotation deadline is 2026-03-08 21:25:07.679094065 +0000 UTC
Mar 08 03:13:17.995064 master-0 kubenswrapper[13046]: I0308 03:13:17.995054 13046 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h11m49.684044014s for next certificate rotation
Mar 08 03:13:17.996312 master-0 kubenswrapper[13046]: I0308 03:13:17.996226 13046 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:13:17.998016 master-0 kubenswrapper[13046]: I0308 03:13:17.997967 13046 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:13:18.000456 master-0 kubenswrapper[13046]: I0308 03:13:18.000385 13046 log.go:25] "Validated CRI v1 runtime API"
Mar 08 03:13:18.005075 master-0 kubenswrapper[13046]: I0308 03:13:18.005048 13046 log.go:25] "Validated CRI v1 image API"
Mar 08 03:13:18.006540 master-0 kubenswrapper[13046]: I0308 03:13:18.006520 13046 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 08 03:13:18.015408 master-0 kubenswrapper[13046]: I0308 03:13:18.015374 13046 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 87790f63-c01f-464b-b8aa-2380aaf22629:/dev/vda3 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 08 03:13:18.016800 master-0 kubenswrapper[13046]: I0308 03:13:18.015476 13046 fs.go:136] Filesystem partitions: 
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/00fbf6ab4c06c34054572affb4623a1ba6b78e4e0116048ccabe5fb462c0f796/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/00fbf6ab4c06c34054572affb4623a1ba6b78e4e0116048ccabe5fb462c0f796/userdata/shm major:0 minor:572 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/04168952ada741f79304ee9b25e1212567fc1ce3d719a0050a26b711accbbea4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/04168952ada741f79304ee9b25e1212567fc1ce3d719a0050a26b711accbbea4/userdata/shm major:0 minor:647 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/047da584d2529f2fb501b0d6492e21ff57d43239aa0e696e0881839756608bc9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/047da584d2529f2fb501b0d6492e21ff57d43239aa0e696e0881839756608bc9/userdata/shm major:0 minor:809 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/05f6bb5a907a01ee11459dd7672be911eeedf74f96a5a1e584011854a9d81b18/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/05f6bb5a907a01ee11459dd7672be911eeedf74f96a5a1e584011854a9d81b18/userdata/shm major:0 minor:385 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/09504f5af1d7c056fa184727bb790ba83f7a308b15d1c9ebf34076ea08bbf988/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/09504f5af1d7c056fa184727bb790ba83f7a308b15d1c9ebf34076ea08bbf988/userdata/shm major:0 minor:116 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0d9abd668d0f5e4396724f5ea282e6dd1f64c0edb81c83294ae09a514ba683b4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0d9abd668d0f5e4396724f5ea282e6dd1f64c0edb81c83294ae09a514ba683b4/userdata/shm major:0 minor:652 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/111fa7be9df663403130635550a8dd29af3564ad73e3800569dcb7f1fe8c2849/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/111fa7be9df663403130635550a8dd29af3564ad73e3800569dcb7f1fe8c2849/userdata/shm major:0 minor:431 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/11a698247fc6f9c54a51413289ea242dad54ebc3d1193e702f60f1dd98b867ee/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/11a698247fc6f9c54a51413289ea242dad54ebc3d1193e702f60f1dd98b867ee/userdata/shm major:0 minor:783 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b/userdata/shm major:0 minor:237 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/232775f5a2d5493c0a82abf166454589f3f2855c9d7aba021d33f9d3267ef323/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/232775f5a2d5493c0a82abf166454589f3f2855c9d7aba021d33f9d3267ef323/userdata/shm major:0 minor:727 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab/userdata/shm major:0 minor:274 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/26d2cf7f711f68c3d3c9308afe0087b283b13f0a913f5e231daad29627564b0c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/26d2cf7f711f68c3d3c9308afe0087b283b13f0a913f5e231daad29627564b0c/userdata/shm major:0 minor:607 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/33074446cdbc88121d8e124ce9ef4086de417c43c65ed95b23d3c327d6455998/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/33074446cdbc88121d8e124ce9ef4086de417c43c65ed95b23d3c327d6455998/userdata/shm major:0 minor:910 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30/userdata/shm major:0 minor:601 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3b71760d7242393a221c7ae1af331545931cbaec81501f72daac6ae1c2882487/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3b71760d7242393a221c7ae1af331545931cbaec81501f72daac6ae1c2882487/userdata/shm major:0 minor:648 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3b767f72fbe851c0148683712fba4f0872103808c8eb0533886fa5261badacc5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3b767f72fbe851c0148683712fba4f0872103808c8eb0533886fa5261badacc5/userdata/shm major:0 minor:495 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/40b87ce5bc138e32a2067ed918b783e37e64b1d585f2b0c8e8982345833631fd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/40b87ce5bc138e32a2067ed918b783e37e64b1d585f2b0c8e8982345833631fd/userdata/shm major:0 minor:491 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/47b9c8e39f771f4a9c9b48442e3a3c8bc53bf486bacb3ac02dc486b0fde5415d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/47b9c8e39f771f4a9c9b48442e3a3c8bc53bf486bacb3ac02dc486b0fde5415d/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/5084fee9f78fd3409fb341ad2acde32d4d944cb2a228b141d353e9f022872f48/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5084fee9f78fd3409fb341ad2acde32d4d944cb2a228b141d353e9f022872f48/userdata/shm major:0 minor:845 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5888a4a5d71348b379d6e9015d48df3d9c05837487495c483686efe0c2418c25/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5888a4a5d71348b379d6e9015d48df3d9c05837487495c483686efe0c2418c25/userdata/shm major:0 minor:468 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5a6e3c3e5ef7bc2875a4438317596870f92270cd7e8853931713a647f7c41386/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5a6e3c3e5ef7bc2875a4438317596870f92270cd7e8853931713a647f7c41386/userdata/shm major:0 minor:467 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5e30ced5c1465fa4b9f72a89783db5d665983a50641b79b25a38f7c94e44add4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5e30ced5c1465fa4b9f72a89783db5d665983a50641b79b25a38f7c94e44add4/userdata/shm major:0 minor:463 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5f838d68f76fe62ef4db0397f12606cc88f67f5fe18c59d85b5a1981e0647d72/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5f838d68f76fe62ef4db0397f12606cc88f67f5fe18c59d85b5a1981e0647d72/userdata/shm major:0 minor:839 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6c725404c21fc1f8e1d386945b71a5debdae8332b549c2d533bc3d6a6b387f25/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6c725404c21fc1f8e1d386945b71a5debdae8332b549c2d533bc3d6a6b387f25/userdata/shm major:0 minor:239 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/702795b7a3b9492f17a3552f3377a1320bf2ba8da965c8533a8f5f8dc47e6545/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/702795b7a3b9492f17a3552f3377a1320bf2ba8da965c8533a8f5f8dc47e6545/userdata/shm major:0 minor:756 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba/userdata/shm major:0 minor:244 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/770f1062326f2c1fcb8406b21c197925d94d8f11835c27505f3a03f752984724/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/770f1062326f2c1fcb8406b21c197925d94d8f11835c27505f3a03f752984724/userdata/shm major:0 minor:584 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed/userdata/shm major:0 minor:794 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7ca736d5c0f4dfec580e9c43c992c5b2d64a84418a876f71fc7a9325b6f9c563/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7ca736d5c0f4dfec580e9c43c992c5b2d64a84418a876f71fc7a9325b6f9c563/userdata/shm major:0 minor:788 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939/userdata/shm major:0 minor:146 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/887566f7bfcbbe929a3906442fbf37654b94c1b82760831978893cfbf803fe8a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/887566f7bfcbbe929a3906442fbf37654b94c1b82760831978893cfbf803fe8a/userdata/shm major:0 minor:465 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9262fbf1bea820c1d6587d88007865b18e2191381d424746083eaa6434ea0fcd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9262fbf1bea820c1d6587d88007865b18e2191381d424746083eaa6434ea0fcd/userdata/shm major:0 minor:771 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/92d33cd4d391db44fa59251ab4f865e88339c3b4327ec053c536f051a308ce2b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/92d33cd4d391db44fa59251ab4f865e88339c3b4327ec053c536f051a308ce2b/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f/userdata/shm major:0 minor:598 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f/userdata/shm major:0 minor:247 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9c380d8376b93cf0d471da9a093b8dab4577d756ac31e0b75746f35b913cbd11/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c380d8376b93cf0d471da9a093b8dab4577d756ac31e0b75746f35b913cbd11/userdata/shm major:0 minor:881 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a153dffc082c8d8e34a6c6e6c0c21f4bb223cf1b6ae19843ae82a4a21f8d697f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a153dffc082c8d8e34a6c6e6c0c21f4bb223cf1b6ae19843ae82a4a21f8d697f/userdata/shm major:0 minor:829 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176/userdata/shm major:0 minor:811 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a8085b4e985b562b8d336416a01305d62c87bbe11cf1a12349c7ff41540427d2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a8085b4e985b562b8d336416a01305d62c87bbe11cf1a12349c7ff41540427d2/userdata/shm major:0 minor:490 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a9677e44cf88488e86493a105f95a756fe5dcdb4e68b6740b2fed8252e50fe4c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a9677e44cf88488e86493a105f95a756fe5dcdb4e68b6740b2fed8252e50fe4c/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa875da543c11c30db67f621b636a4334559efa7a8f51023550cfe8454360f9c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa875da543c11c30db67f621b636a4334559efa7a8f51023550cfe8454360f9c/userdata/shm major:0 minor:610 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/aa9984f4888ad176cca86a609f919e722ac828a4b46cdcdc3a09bfd6dca13141/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa9984f4888ad176cca86a609f919e722ac828a4b46cdcdc3a09bfd6dca13141/userdata/shm major:0 minor:586 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ab315e2bf3bc162089d02708f674e1774b5dd32ef86eb2274f79f03cc0e87f08/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ab315e2bf3bc162089d02708f674e1774b5dd32ef86eb2274f79f03cc0e87f08/userdata/shm major:0 minor:418 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ac8609fedee0058569da7d8a957a2d1f873b1e97869f7515236d24de03c1a1d3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ac8609fedee0058569da7d8a957a2d1f873b1e97869f7515236d24de03c1a1d3/userdata/shm major:0 minor:309 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ae9034db4782bac8f7d81887d48fe45bb1b3f6c402f0bfca0b19827ba74bb1e6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ae9034db4782bac8f7d81887d48fe45bb1b3f6c402f0bfca0b19827ba74bb1e6/userdata/shm major:0 minor:642 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b4547ec615a98dfc1a4d3f423cf139e6774712d10eb4a01d3d753a13dcc2d3fd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4547ec615a98dfc1a4d3f423cf139e6774712d10eb4a01d3d753a13dcc2d3fd/userdata/shm major:0 minor:790 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/bf0f6570fe84a3058c6d6122c2d052c6c8b6d42a5f14c4cbfb5452cbc6866dd1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bf0f6570fe84a3058c6d6122c2d052c6c8b6d42a5f14c4cbfb5452cbc6866dd1/userdata/shm major:0 minor:877 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cb298ff85bc6afefe78d9670cec4232d77064bf8eb867d648f99dcfde97ded03/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cb298ff85bc6afefe78d9670cec4232d77064bf8eb867d648f99dcfde97ded03/userdata/shm major:0 minor:805 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cb95d850788ef393eb0f1ea09b2f5ec3ff3892998a30374932e76aea89c669e6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cb95d850788ef393eb0f1ea09b2f5ec3ff3892998a30374932e76aea89c669e6/userdata/shm major:0 minor:653 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cc2c80df51f36394dc9dae6d283de990f51bae7d557e41ccd7171b38db33c170/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cc2c80df51f36394dc9dae6d283de990f51bae7d557e41ccd7171b38db33c170/userdata/shm major:0 minor:117 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d60de575ffc4ab26e9f652593a89b44b4516637c5944d999700e15162300a100/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d60de575ffc4ab26e9f652593a89b44b4516637c5944d999700e15162300a100/userdata/shm major:0 minor:650 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df48884077bbd8fba7227e7cebbab4db2812d597832db429a888dd44decf9996/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df48884077bbd8fba7227e7cebbab4db2812d597832db429a888dd44decf9996/userdata/shm major:0 minor:381 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e023598af62616102fc3da25dddc7bed12c4ad58ecf15ebabad27596e663a5e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e023598af62616102fc3da25dddc7bed12c4ad58ecf15ebabad27596e663a5e7/userdata/shm major:0 minor:645 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223/userdata/shm major:0 minor:99 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e9eec45b78006702c1cd45f46eb5970fb8a098410841c5321ec7f96f7dedcf63/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e9eec45b78006702c1cd45f46eb5970fb8a098410841c5321ec7f96f7dedcf63/userdata/shm major:0 minor:434 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e9fad8077e3e386a60b5dc7bb7e5c6bd154a1e1fbbbc76393683792778d9fac5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e9fad8077e3e386a60b5dc7bb7e5c6bd154a1e1fbbbc76393683792778d9fac5/userdata/shm major:0 minor:644 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ee50d0908ef8218db69129064969d66d86843ed87cd667dcc60ef7e0d8a70f21/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee50d0908ef8218db69129064969d66d86843ed87cd667dcc60ef7e0d8a70f21/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f330f5d17188d70a58eecf3d0a2330b70f4408aee114d8d6465e47081bd71e07/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f330f5d17188d70a58eecf3d0a2330b70f4408aee114d8d6465e47081bd71e07/userdata/shm major:0 minor:374 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f94d3d5f4c829c12aa9cdd79c3a8b919521e9b4705852dc7634f236661eedb2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f94d3d5f4c829c12aa9cdd79c3a8b919521e9b4705852dc7634f236661eedb2a/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575/userdata/shm major:0 minor:276 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/0781e6af-f5b5-40f7-bb7f-5bc6978b4957/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/0781e6af-f5b5-40f7-bb7f-5bc6978b4957/volumes/kubernetes.io~projected/kube-api-access major:0 minor:838 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0e569889-4759-4046-b0ed-e550078521c6/volumes/kubernetes.io~projected/kube-api-access-m97fm:{mountpoint:/var/lib/kubelet/pods/0e569889-4759-4046-b0ed-e550078521c6/volumes/kubernetes.io~projected/kube-api-access-m97fm major:0 minor:787 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0e569889-4759-4046-b0ed-e550078521c6/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/0e569889-4759-4046-b0ed-e550078521c6/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:785 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1092f2a6-865c-4706-bba7-068621e85ebc/volumes/kubernetes.io~projected/kube-api-access-llwh7:{mountpoint:/var/lib/kubelet/pods/1092f2a6-865c-4706-bba7-068621e85ebc/volumes/kubernetes.io~projected/kube-api-access-llwh7 major:0 minor:922 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1092f2a6-865c-4706-bba7-068621e85ebc/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/1092f2a6-865c-4706-bba7-068621e85ebc/volumes/kubernetes.io~secret/proxy-tls major:0 minor:921 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/17eaab63-9ba9-4a4a-891d-a76aa3f03b46/volumes/kubernetes.io~projected/kube-api-access-pb2xh:{mountpoint:/var/lib/kubelet/pods/17eaab63-9ba9-4a4a-891d-a76aa3f03b46/volumes/kubernetes.io~projected/kube-api-access-pb2xh major:0 minor:780 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/17eaab63-9ba9-4a4a-891d-a76aa3f03b46/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/17eaab63-9ba9-4a4a-891d-a76aa3f03b46/volumes/kubernetes.io~secret/cert major:0 minor:786 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1a6e3f01-0f22-4961-b450-56aca5477943/volumes/kubernetes.io~projected/kube-api-access-qj8dt:{mountpoint:/var/lib/kubelet/pods/1a6e3f01-0f22-4961-b450-56aca5477943/volumes/kubernetes.io~projected/kube-api-access-qj8dt major:0 minor:779 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/234638fe-5577-45bc-9094-907c5611da38/volumes/kubernetes.io~projected/kube-api-access-6bcd7:{mountpoint:/var/lib/kubelet/pods/234638fe-5577-45bc-9094-907c5611da38/volumes/kubernetes.io~projected/kube-api-access-6bcd7 major:0 minor:828 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/234638fe-5577-45bc-9094-907c5611da38/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/234638fe-5577-45bc-9094-907c5611da38/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:827 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~projected/kube-api-access-4nrpc:{mountpoint:/var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~projected/kube-api-access-4nrpc major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~secret/webhook-certs major:0 minor:640 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/275be8d3-df30-46f7-9d0a-806e404dfd57/volumes/kubernetes.io~projected/kube-api-access-fb4bq:{mountpoint:/var/lib/kubelet/pods/275be8d3-df30-46f7-9d0a-806e404dfd57/volumes/kubernetes.io~projected/kube-api-access-fb4bq major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~projected/kube-api-access-5vt6t:{mountpoint:/var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~projected/kube-api-access-5vt6t major:0 minor:232 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:637 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2dc664e3-7f37-4fba-8104-544ffb18c1bd/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/2dc664e3-7f37-4fba-8104-544ffb18c1bd/volumes/kubernetes.io~projected/kube-api-access major:0 minor:602 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/306b824f-dcfb-4e69-9a23-64dfbae61852/volumes/kubernetes.io~projected/kube-api-access-4pxwl:{mountpoint:/var/lib/kubelet/pods/306b824f-dcfb-4e69-9a23-64dfbae61852/volumes/kubernetes.io~projected/kube-api-access-4pxwl major:0 minor:378 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~projected/kube-api-access major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33c15b06-a21e-411f-b324-3ae0c7f0e9a4/volumes/kubernetes.io~projected/kube-api-access-qt69c:{mountpoint:/var/lib/kubelet/pods/33c15b06-a21e-411f-b324-3ae0c7f0e9a4/volumes/kubernetes.io~projected/kube-api-access-qt69c major:0 minor:778 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33c15b06-a21e-411f-b324-3ae0c7f0e9a4/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/33c15b06-a21e-411f-b324-3ae0c7f0e9a4/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:777 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~projected/kube-api-access-trhxt:{mountpoint:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~projected/kube-api-access-trhxt major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~projected/kube-api-access-qlj9x:{mountpoint:/var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~projected/kube-api-access-qlj9x major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:638 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~projected/kube-api-access-rk5ll:{mountpoint:/var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~projected/kube-api-access-rk5ll major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~secret/srv-cert major:0 minor:636 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50ab8f71-42b8-4967-8a0b-016647c59a37/volumes/kubernetes.io~projected/kube-api-access-h9jsw:{mountpoint:/var/lib/kubelet/pods/50ab8f71-42b8-4967-8a0b-016647c59a37/volumes/kubernetes.io~projected/kube-api-access-h9jsw major:0 minor:562 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/52836130-d42e-495c-adbf-19ff9a393347/volumes/kubernetes.io~projected/kube-api-access-mdxtt:{mountpoint:/var/lib/kubelet/pods/52836130-d42e-495c-adbf-19ff9a393347/volumes/kubernetes.io~projected/kube-api-access-mdxtt major:0 minor:114 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/52836130-d42e-495c-adbf-19ff9a393347/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/52836130-d42e-495c-adbf-19ff9a393347/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~projected/ca-certs major:0 minor:485 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~projected/kube-api-access-dskxf:{mountpoint:/var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~projected/kube-api-access-dskxf major:0 minor:486 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:350 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/volumes/kubernetes.io~projected/kube-api-access-rzmjd:{mountpoint:/var/lib/kubelet/pods/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/volumes/kubernetes.io~projected/kube-api-access-rzmjd major:0 minor:373 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a2c9576-f7bd-4ac5-a7fe-530f26642f97/volumes/kubernetes.io~projected/kube-api-access-84q5n:{mountpoint:/var/lib/kubelet/pods/5a2c9576-f7bd-4ac5-a7fe-530f26642f97/volumes/kubernetes.io~projected/kube-api-access-84q5n major:0 minor:563 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5a2c9576-f7bd-4ac5-a7fe-530f26642f97/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/5a2c9576-f7bd-4ac5-a7fe-530f26642f97/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:560 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~projected/kube-api-access-m2vmz:{mountpoint:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~projected/kube-api-access-m2vmz major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~secret/cert major:0 minor:458 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:460 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~projected/kube-api-access-7jk4m:{mountpoint:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~projected/kube-api-access-7jk4m major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/68309159-130a-4ffa-acec-95dc4b795b8f/volumes/kubernetes.io~projected/kube-api-access-k58bm:{mountpoint:/var/lib/kubelet/pods/68309159-130a-4ffa-acec-95dc4b795b8f/volumes/kubernetes.io~projected/kube-api-access-k58bm major:0 minor:571 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6a9d0240-fc00-4d78-9458-8f53b1876f1b/volumes/kubernetes.io~projected/kube-api-access-b2vvq:{mountpoint:/var/lib/kubelet/pods/6a9d0240-fc00-4d78-9458-8f53b1876f1b/volumes/kubernetes.io~projected/kube-api-access-b2vvq major:0 minor:804 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6bd07fa0-00f3-4267-b64a-1e7c02fdf148/volumes/kubernetes.io~projected/kube-api-access-h8fg7:{mountpoint:/var/lib/kubelet/pods/6bd07fa0-00f3-4267-b64a-1e7c02fdf148/volumes/kubernetes.io~projected/kube-api-access-h8fg7 major:0 minor:920 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6bd07fa0-00f3-4267-b64a-1e7c02fdf148/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6bd07fa0-00f3-4267-b64a-1e7c02fdf148/volumes/kubernetes.io~secret/serving-cert major:0 minor:919 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~projected/kube-api-access-wf24l:{mountpoint:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~projected/kube-api-access-wf24l major:0 minor:269 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~secret/serving-cert major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76ceb013-e999-4f15-bf25-f8dcd2647f9f/volumes/kubernetes.io~projected/kube-api-access-s8qqr:{mountpoint:/var/lib/kubelet/pods/76ceb013-e999-4f15-bf25-f8dcd2647f9f/volumes/kubernetes.io~projected/kube-api-access-s8qqr major:0 minor:115 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~projected/kube-api-access-m7hzl:{mountpoint:/var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~projected/kube-api-access-m7hzl major:0 minor:123 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~secret/metrics-certs major:0 minor:641 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~projected/kube-api-access-bkfwp:{mountpoint:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~projected/kube-api-access-bkfwp major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7e324f6c-ee4c-42bc-b241-9c6938749854/volumes/kubernetes.io~projected/kube-api-access-qgw92:{mountpoint:/var/lib/kubelet/pods/7e324f6c-ee4c-42bc-b241-9c6938749854/volumes/kubernetes.io~projected/kube-api-access-qgw92 major:0 minor:592 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7ea81472-8a81-45ec-a07d-8710f47a927d/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/7ea81472-8a81-45ec-a07d-8710f47a927d/volumes/kubernetes.io~projected/kube-api-access major:0 minor:694 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~projected/kube-api-access-4lb9w:{mountpoint:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~projected/kube-api-access-4lb9w major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/volumes/kubernetes.io~projected/ca-certs major:0 minor:483 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/volumes/kubernetes.io~projected/kube-api-access-4nwgh:{mountpoint:/var/lib/kubelet/pods/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/volumes/kubernetes.io~projected/kube-api-access-4nwgh major:0 minor:484 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~projected/kube-api-access-s8dn9:{mountpoint:/var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~projected/kube-api-access-s8dn9 major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~secret/metrics-tls major:0 minor:462 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~projected/kube-api-access-r8pfx:{mountpoint:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~projected/kube-api-access-r8pfx major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1/volumes/kubernetes.io~projected/kube-api-access-d5c5z:{mountpoint:/var/lib/kubelet/pods/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1/volumes/kubernetes.io~projected/kube-api-access-d5c5z major:0 minor:103 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/volumes/kubernetes.io~projected/kube-api-access-wk7jv:{mountpoint:/var/lib/kubelet/pods/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/volumes/kubernetes.io~projected/kube-api-access-wk7jv major:0 minor:782 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/volumes/kubernetes.io~secret/serving-cert major:0 minor:773 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9cf6ce1a-c203-4033-86be-be16694a9062/volumes/kubernetes.io~projected/kube-api-access-ml7t9:{mountpoint:/var/lib/kubelet/pods/9cf6ce1a-c203-4033-86be-be16694a9062/volumes/kubernetes.io~projected/kube-api-access-ml7t9 major:0 minor:561 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~projected/kube-api-access-pv8wt:{mountpoint:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~projected/kube-api-access-pv8wt major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aa781f72-e72f-47e1-b37a-977340c182c8/volumes/kubernetes.io~projected/kube-api-access-b25w4:{mountpoint:/var/lib/kubelet/pods/aa781f72-e72f-47e1-b37a-977340c182c8/volumes/kubernetes.io~projected/kube-api-access-b25w4 major:0 minor:304 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~projected/kube-api-access major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/acb74744-fb99-4663-a7d0-7bae2db205e9/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/acb74744-fb99-4663-a7d0-7bae2db205e9/volumes/kubernetes.io~projected/kube-api-access major:0 minor:600 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af653e87-ce5f-4f1a-a20d-233c563694ba/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/af653e87-ce5f-4f1a-a20d-233c563694ba/volumes/kubernetes.io~projected/kube-api-access major:0 minor:722 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af653e87-ce5f-4f1a-a20d-233c563694ba/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/af653e87-ce5f-4f1a-a20d-233c563694ba/volumes/kubernetes.io~secret/serving-cert major:0 minor:713 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b05d5093-20f4-42d5-9db3-811e049cc1b6/volumes/kubernetes.io~projected/kube-api-access-ddxbs:{mountpoint:/var/lib/kubelet/pods/b05d5093-20f4-42d5-9db3-811e049cc1b6/volumes/kubernetes.io~projected/kube-api-access-ddxbs major:0 minor:755 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/volumes/kubernetes.io~projected/kube-api-access-fvzs9:{mountpoint:/var/lib/kubelet/pods/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/volumes/kubernetes.io~projected/kube-api-access-fvzs9 major:0 minor:765 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/volumes/kubernetes.io~secret/serving-cert major:0 minor:762 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~projected/kube-api-access major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba9496ed-060e-4118-9da6-89b82bd49263/volumes/kubernetes.io~projected/kube-api-access-6mv56:{mountpoint:/var/lib/kubelet/pods/ba9496ed-060e-4118-9da6-89b82bd49263/volumes/kubernetes.io~projected/kube-api-access-6mv56 major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~projected/kube-api-access-llf9g:{mountpoint:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~projected/kube-api-access-llf9g major:0 minor:270 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:409 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:414 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/bda3bd48-6de3-49b0-b2ce-96d97e97f178/volumes/kubernetes.io~projected/kube-api-access-885mp:{mountpoint:/var/lib/kubelet/pods/bda3bd48-6de3-49b0-b2ce-96d97e97f178/volumes/kubernetes.io~projected/kube-api-access-885mp major:0 minor:582 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bda3bd48-6de3-49b0-b2ce-96d97e97f178/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/bda3bd48-6de3-49b0-b2ce-96d97e97f178/volumes/kubernetes.io~secret/metrics-tls major:0 minor:581 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~projected/kube-api-access-rf57p:{mountpoint:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~projected/kube-api-access-rf57p major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c31f7cee-d21d-4c23-af9b-1e0180b12e1e/volumes/kubernetes.io~projected/kube-api-access-xpd47:{mountpoint:/var/lib/kubelet/pods/c31f7cee-d21d-4c23-af9b-1e0180b12e1e/volumes/kubernetes.io~projected/kube-api-access-xpd47 major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c31f7cee-d21d-4c23-af9b-1e0180b12e1e/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/c31f7cee-d21d-4c23-af9b-1e0180b12e1e/volumes/kubernetes.io~secret/webhook-certs major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c3729e29-4c57-4f9b-8202-a87fd3a9a722/volumes/kubernetes.io~projected/kube-api-access-s27xv:{mountpoint:/var/lib/kubelet/pods/c3729e29-4c57-4f9b-8202-a87fd3a9a722/volumes/kubernetes.io~projected/kube-api-access-s27xv major:0 minor:901 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c3729e29-4c57-4f9b-8202-a87fd3a9a722/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/c3729e29-4c57-4f9b-8202-a87fd3a9a722/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:900 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c9f377bf-79c5-4425-b5d1-256961835f62/volumes/kubernetes.io~projected/kube-api-access-6nhh9:{mountpoint:/var/lib/kubelet/pods/c9f377bf-79c5-4425-b5d1-256961835f62/volumes/kubernetes.io~projected/kube-api-access-6nhh9 major:0 minor:380 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c9f377bf-79c5-4425-b5d1-256961835f62/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/c9f377bf-79c5-4425-b5d1-256961835f62/volumes/kubernetes.io~secret/signing-key major:0 minor:379 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/caa3a50c-1291-4152-a48a-f7c7b49627db/volumes/kubernetes.io~projected/kube-api-access-6bpwx:{mountpoint:/var/lib/kubelet/pods/caa3a50c-1291-4152-a48a-f7c7b49627db/volumes/kubernetes.io~projected/kube-api-access-6bpwx major:0 minor:844 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/caa3a50c-1291-4152-a48a-f7c7b49627db/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/caa3a50c-1291-4152-a48a-f7c7b49627db/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:843 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~projected/kube-api-access-q9hb9:{mountpoint:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~projected/kube-api-access-q9hb9 major:0 minor:126 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~projected/kube-api-access-422p2:{mountpoint:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~projected/kube-api-access-422p2 major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~projected/kube-api-access-hbqkj:{mountpoint:/var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~projected/kube-api-access-hbqkj major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~secret/srv-cert major:0 minor:628 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~projected/kube-api-access-ntd2k:{mountpoint:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~projected/kube-api-access-ntd2k major:0 minor:639 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/encryption-config major:0 minor:635 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/etcd-client major:0 minor:633 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/serving-cert major:0 minor:634 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:442 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~empty-dir/tmp major:0 minor:441 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~projected/kube-api-access-lh2rs:{mountpoint:/var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~projected/kube-api-access-lh2rs major:0 minor:445 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dfe0357f-dab4-4424-869c-f6070b411a35/volumes/kubernetes.io~projected/kube-api-access-w5xsp:{mountpoint:/var/lib/kubelet/pods/dfe0357f-dab4-4424-869c-f6070b411a35/volumes/kubernetes.io~projected/kube-api-access-w5xsp major:0 minor:583 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~projected/kube-api-access-zrdxk:{mountpoint:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~projected/kube-api-access-zrdxk major:0 minor:249 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~secret/serving-cert major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/kube-api-access-2xfxd:{mountpoint:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/kube-api-access-2xfxd major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:454 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e81d3c37-e8d7-44c8-973e-13992380ce85/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e81d3c37-e8d7-44c8-973e-13992380ce85/volumes/kubernetes.io~projected/kube-api-access major:0 minor:691 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~projected/kube-api-access-lghdk:{mountpoint:/var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~projected/kube-api-access-lghdk major:0 minor:258 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:632 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~projected/kube-api-access-hxqnd:{mountpoint:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~projected/kube-api-access-hxqnd major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~projected/kube-api-access-556dx:{mountpoint:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~projected/kube-api-access-556dx major:0 minor:488 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/encryption-config major:0 minor:440 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/etcd-client major:0 minor:487 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/serving-cert major:0 minor:439 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:260 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/kube-api-access-8zh5b:{mountpoint:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/kube-api-access-8zh5b major:0 minor:271 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~secret/metrics-tls major:0 minor:459 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fe33f926-9348-4498-a892-d2becaeecc14/volumes/kubernetes.io~projected/kube-api-access-rtt8w:{mountpoint:/var/lib/kubelet/pods/fe33f926-9348-4498-a892-d2becaeecc14/volumes/kubernetes.io~projected/kube-api-access-rtt8w major:0 minor:871 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fe33f926-9348-4498-a892-d2becaeecc14/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/fe33f926-9348-4498-a892-d2becaeecc14/volumes/kubernetes.io~secret/proxy-tls major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~projected/kube-api-access-982r4:{mountpoint:/var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~projected/kube-api-access-982r4 major:0 minor:768 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:767 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~secret/webhook-cert major:0 minor:766 fsType:tmpfs blockSize:0} 
overlay_0-101:{mountpoint:/var/lib/containers/storage/overlay/40f9c2974ba036bb89940573af144b4586eefd5a93302f118f7815601d3c098e/merged major:0 minor:101 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/0ec0fd29f070a982a3dfdc9df1c869c587f3154e6892e7b827688bd9a2bd32fe/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/62cbb1f9d29467e71689d3addcba69f073875fef18626a41132714c1a632b40a/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/9eaa7f012506103b2e53795b45cf0bd9ca70f21270befca025803fd7739ce195/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/ab995380e3629db2698087e0aa28a4c5d6aba1208e4eb7c35ea1e0b10ec5f7a1/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/0db3042d052a37916d37142d030b052b85b4b48a2bb22499ae93770cb309dc12/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/f43709de24dee67b88fa62b9f9bc8ec4406cc244e1aa7b4ad1c196f255d7140b/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/e79b7218e6d5c34d1973fbfa72f1dc2fbd3e3374ea434a75590521f4935d2531/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/370cde7eafac098b6a9275153bc67d59fc782c0ca1d678379d115b2992ca7926/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/34f25d2541ccbc258036b532c478586d49916d7689e754bfb2036f77cdb6baaf/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/b4fba388b6f295d3fe277edfb579ced229f3f14e9563179e7c1d33f118008bf6/merged major:0 minor:154 fsType:overlay blockSize:0} 
overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/e872d2b18ddd96ff2f6342448cc9a15aef1b8ff55484f89addee75734f2e1c78/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/2233d9fe74718ab0ae53f5a512c7e547f9d981a9d9114a9700ec31c0933e09b7/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/57a9bfbbbe06e8741b7f71acad7d67f57133e3855842399a07580e12cdf86e5f/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-173:{mountpoint:/var/lib/containers/storage/overlay/937deb5ccbc384a4b4fcd24fdb4ab2a310e104b118773e5458288eb1635b4cc9/merged major:0 minor:173 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/0493015aad0c37f1cf949671a0cd243c66517f1b7801835d988f52a0b4ebaf1f/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/d943d212aa3c805f0ee86ceb2aa1bac6052e1648d2966b3847e65ee807bc9427/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/b186b114d84ebb5d81155ab25a48687fed4904a89bc31cdad5486e28d5195171/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/256c05ea6649d1c6f82ba731be228111eae7455dc21c3a922778e8c4bb9728b6/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/51fc7ff6a4fa287c151dd6eb373813b8ab6fc82de69197375ecd1e21813c9166/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/ce27efa5236d15ebaf2f519800b99cfccf04bb89dedaec7f192fc22145dc0145/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-267:{mountpoint:/var/lib/containers/storage/overlay/d4c637eca45faf4dc5ae119273f6ca19e0b1e41ff9586ceaad7b96e492c21fb1/merged major:0 minor:267 fsType:overlay blockSize:0} 
overlay_0-278:{mountpoint:/var/lib/containers/storage/overlay/0177399de96c59f0d082a83f3c99d8728cd24f4354c84606115f9c7e387fd0bb/merged major:0 minor:278 fsType:overlay blockSize:0} overlay_0-280:{mountpoint:/var/lib/containers/storage/overlay/dd1e4d86d4bb5590e6222601731cda85d2ab1a42246bb905777d795449ce11f2/merged major:0 minor:280 fsType:overlay blockSize:0} overlay_0-282:{mountpoint:/var/lib/containers/storage/overlay/24e1c524b8242cfe6c6a73ec89989cbd20f616c0f238868827c96d17d61df07a/merged major:0 minor:282 fsType:overlay blockSize:0} overlay_0-284:{mountpoint:/var/lib/containers/storage/overlay/aa3e99c07231324870aab48ddf6ac3aa5f979f3c188a21bf2cb09ebc807ce3c6/merged major:0 minor:284 fsType:overlay blockSize:0} overlay_0-288:{mountpoint:/var/lib/containers/storage/overlay/a278a123654ccabfdecc16f667b23335fe34957bb6fa95ed755c6d6f2c25fc45/merged major:0 minor:288 fsType:overlay blockSize:0} overlay_0-290:{mountpoint:/var/lib/containers/storage/overlay/39925a702f39205a7d286aea1120e37a159eb7ea9547eb35c47107156a92beac/merged major:0 minor:290 fsType:overlay blockSize:0} overlay_0-292:{mountpoint:/var/lib/containers/storage/overlay/21bdcaa8c3c2b7bc82e63b75481269ac7050ce80c702009229ad24a08417b1de/merged major:0 minor:292 fsType:overlay blockSize:0} overlay_0-294:{mountpoint:/var/lib/containers/storage/overlay/3a80bde5f6d3bb761fc678109cf7e0321500b962b59e20991bea9b87c9343b7c/merged major:0 minor:294 fsType:overlay blockSize:0} overlay_0-296:{mountpoint:/var/lib/containers/storage/overlay/c36afb1a7d3d04e0a4e3111b86b4e4a854597f2e1301c7de09cc2fcce3f87ac6/merged major:0 minor:296 fsType:overlay blockSize:0} overlay_0-298:{mountpoint:/var/lib/containers/storage/overlay/e6230d35a24902da19a64d730c31efec5bf9088d8973d924e54531764da0630f/merged major:0 minor:298 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/e344abc9ec0b8ea68650286cf106e964370084c9106df9e89c37ad2bd15e3630/merged major:0 minor:300 fsType:overlay blockSize:0} 
overlay_0-302:{mountpoint:/var/lib/containers/storage/overlay/72b3f01c53e382b2bce7d4b7d0197edc22431f3ef65a4864c8918ecb3047d239/merged major:0 minor:302 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/35a809b75e50efe18aa41d88fff14caa5bdf6ddaf6af436a7329b2a45baf1cbc/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/703423a1e574c8eeb5fba2c9449eb9d18325d4766e3c02580b1de8b686d3f7a3/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-315:{mountpoint:/var/lib/containers/storage/overlay/6add983787e3b84484de7b4f45a770c1c1b112189964b4c96e553efc04e5ac19/merged major:0 minor:315 fsType:overlay blockSize:0} overlay_0-339:{mountpoint:/var/lib/containers/storage/overlay/b3a6f43079d695fc24cceb2d9949266ae06999983bb2f10895198749e22150eb/merged major:0 minor:339 fsType:overlay blockSize:0} overlay_0-348:{mountpoint:/var/lib/containers/storage/overlay/035b0a5ab9e80b0b5de71a483c09454054ecd452db2ab3451d3774cb167a2d6c/merged major:0 minor:348 fsType:overlay blockSize:0} overlay_0-376:{mountpoint:/var/lib/containers/storage/overlay/2e86e6dc09331acc3a6f1975fe34e3e02d83e22b6a292f5aa351213337a49ee3/merged major:0 minor:376 fsType:overlay blockSize:0} overlay_0-383:{mountpoint:/var/lib/containers/storage/overlay/9182de087cd98948366ec062f155fa891481e9388eb6a9f5734d23a978b6e3d8/merged major:0 minor:383 fsType:overlay blockSize:0} overlay_0-391:{mountpoint:/var/lib/containers/storage/overlay/2bd312218a4b2134ab0cbbb2584f5eaf7774f13dfee4dbf275e64a1fd393b0ba/merged major:0 minor:391 fsType:overlay blockSize:0} overlay_0-393:{mountpoint:/var/lib/containers/storage/overlay/15b1b415b1d1bffca416d4e9ea2c824e3f25761e44627212a0cb09a8d992720e/merged major:0 minor:393 fsType:overlay blockSize:0} overlay_0-395:{mountpoint:/var/lib/containers/storage/overlay/b435bef36d42923f194e08df2e831f795842675141ecaa677262fa6854fe13dc/merged major:0 minor:395 fsType:overlay blockSize:0} 
overlay_0-415:{mountpoint:/var/lib/containers/storage/overlay/31ce5892c1ba399d21804ebdaaddab81f0e3406d9b35cbc385cab04b806f89b4/merged major:0 minor:415 fsType:overlay blockSize:0} overlay_0-417:{mountpoint:/var/lib/containers/storage/overlay/b205d75eb7da0c455dbeaf4143b99035444f817cd619b2bd71d075d10ebada35/merged major:0 minor:417 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/942565253ab0001cf5039f87568434a0b607930b6816e507bae5f863cc9d0ac2/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-432:{mountpoint:/var/lib/containers/storage/overlay/527a7a9f4bbb1103a5d2999e6a4487f474dc2a3bb7fcffc027a288bd786e06a6/merged major:0 minor:432 fsType:overlay blockSize:0} overlay_0-443:{mountpoint:/var/lib/containers/storage/overlay/9db3bf9438941d4f69057e9d64461c7dbfbd8e35b6f85561489393b110fc17cb/merged major:0 minor:443 fsType:overlay blockSize:0} overlay_0-446:{mountpoint:/var/lib/containers/storage/overlay/08caaaf2d72613d6efb2f5f865c691a059371732ac24d05cc5f109e3a3712148/merged major:0 minor:446 fsType:overlay blockSize:0} overlay_0-452:{mountpoint:/var/lib/containers/storage/overlay/4dda778b43077f90cbca8dc5ce5e9739abb8a7d41dca6a014696f40adaa1d07e/merged major:0 minor:452 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/35db7fa1165e5c9f94f7cac5976707e806554c5f00ac85fae6a7d277e54e0820/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-464:{mountpoint:/var/lib/containers/storage/overlay/feda2bf93cdc4495419a4390a2415575677e5008a87feba5f6584d8545b6fb15/merged major:0 minor:464 fsType:overlay blockSize:0} overlay_0-474:{mountpoint:/var/lib/containers/storage/overlay/506e3a57f23358a62d9d66924896403c6b15c394e6f0b38b2347e36616c911e6/merged major:0 minor:474 fsType:overlay blockSize:0} overlay_0-476:{mountpoint:/var/lib/containers/storage/overlay/aaa059dd0187f4b58258c90539ffa69221faa503b3128c0a8b864c1b061ad461/merged major:0 minor:476 fsType:overlay blockSize:0} 
overlay_0-478:{mountpoint:/var/lib/containers/storage/overlay/3a458914d8114316dd0fa5e28137326229c77799e1059d900123842376627527/merged major:0 minor:478 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/6334d65302c4b77c66483cd075c39ee4fe79e3caac4ea3ce4c41b95b06d1a1f5/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-480:{mountpoint:/var/lib/containers/storage/overlay/00cf3a25bfdb387751fa482ccc0b523519a48122cac43c83f76b8c69f41e5c20/merged major:0 minor:480 fsType:overlay blockSize:0} overlay_0-481:{mountpoint:/var/lib/containers/storage/overlay/52ef149b3dcadbe50e042c7913377b31b5c192811140c6d8b79de2c9ab3a9395/merged major:0 minor:481 fsType:overlay blockSize:0} overlay_0-489:{mountpoint:/var/lib/containers/storage/overlay/978458e114570eab5e67b7fb62d7636806ac79d03a391cb304c51122f4c02699/merged major:0 minor:489 fsType:overlay blockSize:0} overlay_0-497:{mountpoint:/var/lib/containers/storage/overlay/11376e14b99fc8715a7e4f25c7208fd540ffa0d48d849b59734cc463147b17f7/merged major:0 minor:497 fsType:overlay blockSize:0} overlay_0-499:{mountpoint:/var/lib/containers/storage/overlay/81793e964467e9b7b2db5c579e6821c7a47c0a1c33e42e9227c478176d2b1449/merged major:0 minor:499 fsType:overlay blockSize:0} overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/ef91136293460b46568dfc0f9d26264a9ea1328e500294a3ca55620998258677/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-503:{mountpoint:/var/lib/containers/storage/overlay/bc110d19874b2bbf58a7832154c42dd1e15ff0eb8ba950586b669031f718fdcd/merged major:0 minor:503 fsType:overlay blockSize:0} overlay_0-505:{mountpoint:/var/lib/containers/storage/overlay/bb79eabd54d7af58ff4dcbcd05fa43af4656081de8b93d89d7d8f301d857018f/merged major:0 minor:505 fsType:overlay blockSize:0} overlay_0-509:{mountpoint:/var/lib/containers/storage/overlay/7054ed486073a6ecc4011de9ca1ee672cfcd8a994f72856c3ddee9d485b4bb20/merged major:0 minor:509 fsType:overlay blockSize:0} 
overlay_0-511:{mountpoint:/var/lib/containers/storage/overlay/400b84c51d16198de100ff3e663c283691fedcddf1283d220827a5065a584bb7/merged major:0 minor:511 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/ed0fa5573540caa0e03989916ac6b00a32ad2c42ab40751134860e1c42482ec8/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-531:{mountpoint:/var/lib/containers/storage/overlay/491ce6d70adb1a0413d54e3a77fc6956ed53237ebe1a90199f6ddc69b8ead313/merged major:0 minor:531 fsType:overlay blockSize:0} overlay_0-533:{mountpoint:/var/lib/containers/storage/overlay/73e330e3e4341e1738293e42f1edea29164a6a1080d4642334d82bbfce3852f1/merged major:0 minor:533 fsType:overlay blockSize:0} overlay_0-535:{mountpoint:/var/lib/containers/storage/overlay/18a57c0eeb87c930799a225b3b34c75c627aae67e6249c9130a0bb547b5d6d02/merged major:0 minor:535 fsType:overlay blockSize:0} overlay_0-537:{mountpoint:/var/lib/containers/storage/overlay/17288f9ac93aadd4f4d93ced4d5016f21d2954c54734d946d1ad8dc0383d074d/merged major:0 minor:537 fsType:overlay blockSize:0} overlay_0-540:{mountpoint:/var/lib/containers/storage/overlay/075469862d96fb0414047bf6ae5e8bd77d1226dc15a333a6a46c8dc67a3aa775/merged major:0 minor:540 fsType:overlay blockSize:0} overlay_0-544:{mountpoint:/var/lib/containers/storage/overlay/a16417675ee45281d75bd42284ec75137be04a4dbf6bd8532ac0af765c45f1d0/merged major:0 minor:544 fsType:overlay blockSize:0} overlay_0-550:{mountpoint:/var/lib/containers/storage/overlay/8e3ba9931acbf7ac240b9cab709b59967b5d0d15ae0e1d40ece1f833907af61e/merged major:0 minor:550 fsType:overlay blockSize:0} overlay_0-553:{mountpoint:/var/lib/containers/storage/overlay/fa4deaff7cca85592a32f8098e8ce6df0ddeb17604b5786772b9fb814d231fd1/merged major:0 minor:553 fsType:overlay blockSize:0} overlay_0-558:{mountpoint:/var/lib/containers/storage/overlay/7be62cbe518181746a56bed9311cf37b0ff1f449631bfe553348088f25bc5255/merged major:0 minor:558 fsType:overlay blockSize:0} 
overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/c00f160b816e259221575d5a241ced88af0ea2cf9db59987705cf0d6ee2ff5ae/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-564:{mountpoint:/var/lib/containers/storage/overlay/10de5d2ac15b9b523d5a95a135b6e8f9368edf51bba8f09c183e1d6c142e452f/merged major:0 minor:564 fsType:overlay blockSize:0} overlay_0-565:{mountpoint:/var/lib/containers/storage/overlay/9028713c4a72548926a7d3fd32f17a569b55c5a5840c38c7fde2d18991f1a306/merged major:0 minor:565 fsType:overlay blockSize:0} overlay_0-588:{mountpoint:/var/lib/containers/storage/overlay/12477d0b92484d82698cffbfc8064a810de29ef431c54683d310c51d01200ef9/merged major:0 minor:588 fsType:overlay blockSize:0} overlay_0-590:{mountpoint:/var/lib/containers/storage/overlay/010cde900b9467aacd9aff469cdae51d85552e4e2d1e36032faa39b643c5b07b/merged major:0 minor:590 fsType:overlay blockSize:0} overlay_0-596:{mountpoint:/var/lib/containers/storage/overlay/b95663a9cd4e5944ad27c72c33445ac8e4b0c15635cf3179a9835de884e0b0b1/merged major:0 minor:596 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/9d8b0598073bcc8d7331bacd1e5a2ffb09102eb7681db04d22c543f4ccfca5e8/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-603:{mountpoint:/var/lib/containers/storage/overlay/a8b4e0f10bd1ec4e107461792c4f7571f88d44f41dac63e8ba60c8f584fb667f/merged major:0 minor:603 fsType:overlay blockSize:0} overlay_0-605:{mountpoint:/var/lib/containers/storage/overlay/94876325929a5ec95b62ccc5509a275520e1d652286a614c7d905191bbaec11c/merged major:0 minor:605 fsType:overlay blockSize:0} overlay_0-611:{mountpoint:/var/lib/containers/storage/overlay/83274bab9476982ead453d64045d9154e278ba053c10f846383d9067da6c891e/merged major:0 minor:611 fsType:overlay blockSize:0} overlay_0-612:{mountpoint:/var/lib/containers/storage/overlay/2ce7ef03cd339819be461609c823009aec6fe3e00f143c98583a9ae06e234f11/merged major:0 minor:612 fsType:overlay blockSize:0} 
overlay_0-613:{mountpoint:/var/lib/containers/storage/overlay/821c596824d5c9e0f402b8906231df1f610a6cebe2033e6d12df4a63ceb06f46/merged major:0 minor:613 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/bf04e15213a802a01f63b27a1293e16aab99152bfe41da3d1385e37faa966443/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-622:{mountpoint:/var/lib/containers/storage/overlay/8a34c5bfafc12dbb1e0f77f6c8d02c91466263e2757178beac80dc2843efd445/merged major:0 minor:622 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/9fe893d7c61e44ea63eccffb887b389ae0f901d5e4789f316f225e753a6cd8bb/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-658:{mountpoint:/var/lib/containers/storage/overlay/694ae0eccbaf24eaaf2f456bb734ee917f5be1417e78430aad99bdb1a0d1112e/merged major:0 minor:658 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/4646bdca662eef4647809339939b4cac973f68fa4cc60f42be0a678ee9853dc8/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-661:{mountpoint:/var/lib/containers/storage/overlay/27e704e1a1e4758db293960a5effaada864738a957e129bd9e4ffba1fd012f0e/merged major:0 minor:661 fsType:overlay blockSize:0} overlay_0-663:{mountpoint:/var/lib/containers/storage/overlay/c6f9ad2eb553f304bdcd3d260612f71e31655765718cd4b7c1a5bcef19bb04bb/merged major:0 minor:663 fsType:overlay blockSize:0} overlay_0-665:{mountpoint:/var/lib/containers/storage/overlay/b42708a213e714927a91df93c3e1d2233cf4eb664631e13d8383cc4b557dd9a5/merged major:0 minor:665 fsType:overlay blockSize:0} overlay_0-667:{mountpoint:/var/lib/containers/storage/overlay/64b42f2ecc375f5c4502de93d3da43b859c442531473c70c36a05787f039bace/merged major:0 minor:667 fsType:overlay blockSize:0} overlay_0-669:{mountpoint:/var/lib/containers/storage/overlay/2faf7e377e221a72db5b6ae539a994a6ab4c8abdfdf1e6099be8df2ca8781e09/merged major:0 minor:669 fsType:overlay blockSize:0} 
overlay_0-671:{mountpoint:/var/lib/containers/storage/overlay/ce31117f90f58e6dcf4b7b311367c8fdb5c17c1c314570e451f89e8f37f24e4d/merged major:0 minor:671 fsType:overlay blockSize:0} overlay_0-673:{mountpoint:/var/lib/containers/storage/overlay/5aa4bcec97244fd4f9d43dc3166b6dde26129d340e494d5a55ade7b9e31eaf7c/merged major:0 minor:673 fsType:overlay blockSize:0} overlay_0-675:{mountpoint:/var/lib/containers/storage/overlay/23d1b7ed8b1ca360febb8a2c757a92878af287642d597dc0d017dedbb6be7ebc/merged major:0 minor:675 fsType:overlay blockSize:0} overlay_0-676:{mountpoint:/var/lib/containers/storage/overlay/ed94730ec0a7d97aac4c569fd34201b770225625b3216e5a832e4a37a5119d1f/merged major:0 minor:676 fsType:overlay blockSize:0} overlay_0-679:{mountpoint:/var/lib/containers/storage/overlay/6054b252eae48f3c3d94d7a90987218b7bccc34d1219772038d5e0c60541e1fe/merged major:0 minor:679 fsType:overlay blockSize:0} overlay_0-699:{mountpoint:/var/lib/containers/storage/overlay/9af1cca056a579b05f66ae6382daedf4e5928083d4d842109ccd756e434a0117/merged major:0 minor:699 fsType:overlay blockSize:0} overlay_0-701:{mountpoint:/var/lib/containers/storage/overlay/1e16b16575cca6af32ccca27187875a3d93260055e2eda00730e7d6dd0a804d4/merged major:0 minor:701 fsType:overlay blockSize:0} overlay_0-723:{mountpoint:/var/lib/containers/storage/overlay/81a31edf382e70355a03803230f11f4152ed923e47bf532ff8c032843b187056/merged major:0 minor:723 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/49611b5a25827ec97dfaafc31a73443b1859318e444a98a5b38184ba7d4619c1/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-732:{mountpoint:/var/lib/containers/storage/overlay/d91e56689f6c4bf22bec02a0af0412109d8a9d5c5234292e41a4c750dde3a880/merged major:0 minor:732 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/9e8436bc92a8a3296af5164d4c9d05a8c1c72d3e686484931070b5ac536d86e7/merged major:0 minor:74 fsType:overlay blockSize:0} 
overlay_0-758:{mountpoint:/var/lib/containers/storage/overlay/61bd1e367cf2eb7933dce08afbf85f6b156e800cb07ad138c088a29f8615a923/merged major:0 minor:758 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/589d7a59825439a13f501df8dd29e9b1d1eecbd0b2884974edf6dcfe766b9c7f/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-760:{mountpoint:/var/lib/containers/storage/overlay/254394e4bc2bb2f5fe1ecba942dbc67de1d29fcbc734bf16ef454671b18f5830/merged major:0 minor:760 fsType:overlay blockSize:0} overlay_0-763:{mountpoint:/var/lib/containers/storage/overlay/06d44d9b7d22627930c9eaf50ac8038c327bd368350735cf73f1eee902e347f7/merged major:0 minor:763 fsType:overlay blockSize:0} overlay_0-770:{mountpoint:/var/lib/containers/storage/overlay/fdeb88566de10ec4a5f1dd71758c16fbf8d1f80ac32260a2ddadde1ef17397ad/merged major:0 minor:770 fsType:overlay blockSize:0} overlay_0-789:{mountpoint:/var/lib/containers/storage/overlay/3c7692c36c61159e9184ceeb3fb640d28f94fff6c1d07b71cf8642a2a82655e1/merged major:0 minor:789 fsType:overlay blockSize:0} overlay_0-796:{mountpoint:/var/lib/containers/storage/overlay/8dd51f9e37fad0a2ae025dc81f35e90f6cf2e3b3c9a798614bdcc1fc76664819/merged major:0 minor:796 fsType:overlay blockSize:0} overlay_0-798:{mountpoint:/var/lib/containers/storage/overlay/e89e358d53896cc4d53fc4ad029556e6a5dab9c09762904926edf216cd4035d5/merged major:0 minor:798 fsType:overlay blockSize:0} overlay_0-807:{mountpoint:/var/lib/containers/storage/overlay/128dd9fcc5a6d20d8f9590f4cac1f985f8c21721a7e502cbfb7623127cd5c9be/merged major:0 minor:807 fsType:overlay blockSize:0} overlay_0-813:{mountpoint:/var/lib/containers/storage/overlay/c3f43c7fd61258d7d11b33370013f780312ad98de599bbab6eab3ea5b7da3d74/merged major:0 minor:813 fsType:overlay blockSize:0} overlay_0-814:{mountpoint:/var/lib/containers/storage/overlay/7d4b53d2548bbeebe088c9e542f44c2e09c877b4bec30eb6b2cedc323770e4a0/merged major:0 minor:814 fsType:overlay blockSize:0} 
overlay_0-817:{mountpoint:/var/lib/containers/storage/overlay/f1557c81f1d7a8ff70b8e526044c723a80c4875e6abeb6a7e139d0d4e655e677/merged major:0 minor:817 fsType:overlay blockSize:0} overlay_0-819:{mountpoint:/var/lib/containers/storage/overlay/92db0f2d29206112ead98283b49ffacdc0f4a6e21debcefd7fdf0833ece479d1/merged major:0 minor:819 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/5b6715530a2fde1e11e0d1403a69f4334d7c47efcaa06fda3442cd961e7c0b73/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-823:{mountpoint:/var/lib/containers/storage/overlay/96c0c4bd97e574ec1534a39b50e9c9037c73cf0f1ef71766ffbd45a909d6549f/merged major:0 minor:823 fsType:overlay blockSize:0} overlay_0-834:{mountpoint:/var/lib/containers/storage/overlay/5af5626331828cf0aea51b9bc880e172693b524e175039f64785ad60f256cd80/merged major:0 minor:834 fsType:overlay blockSize:0} overlay_0-836:{mountpoint:/var/lib/containers/storage/overlay/2c8c9691a2889bae60cc882bbc7736fe513ed80585f90d2afe03080485ee150c/merged major:0 minor:836 fsType:overlay blockSize:0} overlay_0-848:{mountpoint:/var/lib/containers/storage/overlay/723d667d2a31a1732db9c2cc831d40c54a296957799b45da724ebf938b8b8485/merged major:0 minor:848 fsType:overlay blockSize:0} overlay_0-850:{mountpoint:/var/lib/containers/storage/overlay/f0d7b0a7790d706061bd2571c5ac6e18a346a842c393fbdd0a650551e3ae91e5/merged major:0 minor:850 fsType:overlay blockSize:0} overlay_0-856:{mountpoint:/var/lib/containers/storage/overlay/a7214a08f47c72e24d0bc82109b3b4e23be52800c535363fd058796d3538b252/merged major:0 minor:856 fsType:overlay blockSize:0} overlay_0-858:{mountpoint:/var/lib/containers/storage/overlay/7be57e3dfb6a5ebdb9c41bc8409ff4824059787b445d06121df6c75531ce94fc/merged major:0 minor:858 fsType:overlay blockSize:0} overlay_0-861:{mountpoint:/var/lib/containers/storage/overlay/363f19a677f750b37bc9aed70064557758d6f91f335f0f8f45b1c3f90fbafcca/merged major:0 minor:861 fsType:overlay blockSize:0} 
overlay_0-873:{mountpoint:/var/lib/containers/storage/overlay/861763f3fbab953d39f94a4a105d6295bb33389084f5f6f91e510dd41e5a3277/merged major:0 minor:873 fsType:overlay blockSize:0} overlay_0-875:{mountpoint:/var/lib/containers/storage/overlay/f1bf14c3723edb47d0c5ee71200be3395ebe2fe8166a9e36a12fefc60ab852e5/merged major:0 minor:875 fsType:overlay blockSize:0} overlay_0-879:{mountpoint:/var/lib/containers/storage/overlay/1e93edc7d0b4fa6364135a6cea0e03811b033f1ece89fb33f5397fab45220fa6/merged major:0 minor:879 fsType:overlay blockSize:0} overlay_0-887:{mountpoint:/var/lib/containers/storage/overlay/8523ddbaa6f263c889448ac9a452e21c9f41473461f6a543fc82643054e49fcd/merged major:0 minor:887 fsType:overlay blockSize:0} overlay_0-889:{mountpoint:/var/lib/containers/storage/overlay/8d8f7bfd2617a5d0f0da0820fe8626052f78d0d00913ce330549a7ff21ce0a63/merged major:0 minor:889 fsType:overlay blockSize:0} overlay_0-891:{mountpoint:/var/lib/containers/storage/overlay/a29f2909c03096a346114671a1243f1f68074c710fa650235b354e0ba1017596/merged major:0 minor:891 fsType:overlay blockSize:0} overlay_0-893:{mountpoint:/var/lib/containers/storage/overlay/6b1fe1b61699009797c9fca9bdc5e7f56929e95acb38c481fdb925d8f45d893e/merged major:0 minor:893 fsType:overlay blockSize:0} overlay_0-895:{mountpoint:/var/lib/containers/storage/overlay/6b8c0daf19a35c34a46afbc119ddea8fe253b9cf2bb0f61f4bc67471f8d06ecf/merged major:0 minor:895 fsType:overlay blockSize:0} overlay_0-905:{mountpoint:/var/lib/containers/storage/overlay/b01ce08eadbe2f1737c01a8c16c282f09c546d6fe90aa5a87d1783df693b49da/merged major:0 minor:905 fsType:overlay blockSize:0} overlay_0-907:{mountpoint:/var/lib/containers/storage/overlay/f993ed3df53d0a4929deb9c8d2e7b9b792f2995c8ec6f292c35d829e31d0baea/merged major:0 minor:907 fsType:overlay blockSize:0} overlay_0-94:{mountpoint:/var/lib/containers/storage/overlay/2d5f30049a4aacc0f26ae99bdb75cee46d54f36d0323acf02c6418975ed5d45e/merged major:0 minor:94 fsType:overlay blockSize:0}]
Mar 08 03:13:18.055295 master-0 kubenswrapper[13046]: I0308 03:13:18.053929 13046 manager.go:217] Machine: {Timestamp:2026-03-08 03:13:18.053131505 +0000 UTC m=+0.131898732 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:b3a4f7075cb34fef92c3bca0876fb6a9 SystemUUID:b3a4f707-5cb3-4fef-92c3-bca0876fb6a9 BootID:ab1d3f01-9ab7-4687-a25d-e07ad2358a90 Filesystems:[{Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/92d33cd4d391db44fa59251ab4f865e88339c3b4327ec053c536f051a308ce2b/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-663 DeviceMajor:0 DeviceMinor:663 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-611 DeviceMajor:0 DeviceMinor:611 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-763 DeviceMajor:0 DeviceMinor:763 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c380d8376b93cf0d471da9a093b8dab4577d756ac31e0b75746f35b913cbd11/userdata/shm DeviceMajor:0 DeviceMinor:881 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a8085b4e985b562b8d336416a01305d62c87bbe11cf1a12349c7ff41540427d2/userdata/shm DeviceMajor:0 DeviceMinor:490 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-511 DeviceMajor:0 DeviceMinor:511 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/0781e6af-f5b5-40f7-bb7f-5bc6978b4957/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:838 Capacity:200003584 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-875 DeviceMajor:0 DeviceMinor:875 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0e569889-4759-4046-b0ed-e550078521c6/volumes/kubernetes.io~projected/kube-api-access-m97fm DeviceMajor:0 DeviceMinor:787 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/acdbe731524a429914acc46c6b0fc8566c114ceaee29921a9a57d998fb2b4214/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/11e59ac64a0debaedb355a161f6731179e42817b56a520a8672d4ac7d8e22f1b/userdata/shm DeviceMajor:0 DeviceMinor:237 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/aa781f72-e72f-47e1-b37a-977340c182c8/volumes/kubernetes.io~projected/kube-api-access-b25w4 DeviceMajor:0 DeviceMinor:304 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5e30ced5c1465fa4b9f72a89783db5d665983a50641b79b25a38f7c94e44add4/userdata/shm DeviceMajor:0 DeviceMinor:463 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:640 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-758 DeviceMajor:0 DeviceMinor:758 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-836 DeviceMajor:0 DeviceMinor:836 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-895 DeviceMajor:0 DeviceMinor:895 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~projected/kube-api-access-hbqkj DeviceMajor:0 DeviceMinor:235 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:634 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-723 DeviceMajor:0 DeviceMinor:723 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e81d3c37-e8d7-44c8-973e-13992380ce85/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:691 Capacity:200003584 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-544 DeviceMajor:0 DeviceMinor:544 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-497 DeviceMajor:0 DeviceMinor:497 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-658 DeviceMajor:0 DeviceMinor:658 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-661 DeviceMajor:0 DeviceMinor:661 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/02cfbf308f15b996580ef596b0330755f35812b59aaae506481618fd237dc21a/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-553 DeviceMajor:0 DeviceMinor:553 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/234638fe-5577-45bc-9094-907c5611da38/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:827 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/fe33f926-9348-4498-a892-d2becaeecc14/volumes/kubernetes.io~projected/kube-api-access-rtt8w DeviceMajor:0 DeviceMinor:871 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/33074446cdbc88121d8e124ce9ef4086de417c43c65ed95b23d3c327d6455998/userdata/shm DeviceMajor:0 DeviceMinor:910 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1/volumes/kubernetes.io~projected/kube-api-access-d5c5z DeviceMajor:0 DeviceMinor:103 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-173 DeviceMajor:0 DeviceMinor:173 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:637 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:638 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-814 DeviceMajor:0 DeviceMinor:814 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0e569889-4759-4046-b0ed-e550078521c6/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:785 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-476 DeviceMajor:0 DeviceMinor:476 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7e324f6c-ee4c-42bc-b241-9c6938749854/volumes/kubernetes.io~projected/kube-api-access-qgw92 DeviceMajor:0 DeviceMinor:592 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:767 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/11a698247fc6f9c54a51413289ea242dad54ebc3d1193e702f60f1dd98b867ee/userdata/shm DeviceMajor:0 DeviceMinor:783 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:230 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6c725404c21fc1f8e1d386945b71a5debdae8332b549c2d533bc3d6a6b387f25/userdata/shm DeviceMajor:0 DeviceMinor:239 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-503 DeviceMajor:0 DeviceMinor:503 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/17eaab63-9ba9-4a4a-891d-a76aa3f03b46/volumes/kubernetes.io~projected/kube-api-access-pb2xh DeviceMajor:0 DeviceMinor:780 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~projected/kube-api-access-trhxt DeviceMajor:0 DeviceMinor:223 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e51dfab7b748272e1450108a48121b5a8c713aa6f58f346985d6b3286efbefb3/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-391 DeviceMajor:0 DeviceMinor:391 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/26d2cf7f711f68c3d3c9308afe0087b283b13f0a913f5e231daad29627564b0c/userdata/shm DeviceMajor:0 DeviceMinor:607 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-891 DeviceMajor:0 DeviceMinor:891 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6bd07fa0-00f3-4267-b64a-1e7c02fdf148/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:919 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ac8609fedee0058569da7d8a957a2d1f873b1e97869f7515236d24de03c1a1d3/userdata/shm DeviceMajor:0 DeviceMinor:309 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/306b824f-dcfb-4e69-9a23-64dfbae61852/volumes/kubernetes.io~projected/kube-api-access-4pxwl DeviceMajor:0 DeviceMinor:378 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-395 DeviceMajor:0 DeviceMinor:395 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:414 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:633 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/234638fe-5577-45bc-9094-907c5611da38/volumes/kubernetes.io~projected/kube-api-access-6bcd7 DeviceMajor:0 DeviceMinor:828 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~projected/kube-api-access-hxqnd DeviceMajor:0 DeviceMinor:227 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/755ba253608e09cf00383e10d3ba14eaf590dfbbb3b829aad725eb83d1e338ba/userdata/shm DeviceMajor:0 DeviceMinor:244 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-290 DeviceMajor:0 DeviceMinor:290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c9f377bf-79c5-4425-b5d1-256961835f62/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:379 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa9984f4888ad176cca86a609f919e722ac828a4b46cdcdc3a09bfd6dca13141/userdata/shm DeviceMajor:0 DeviceMinor:586 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cc2c80df51f36394dc9dae6d283de990f51bae7d557e41ccd7171b38db33c170/userdata/shm DeviceMajor:0 DeviceMinor:117 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/kube-api-access-2xfxd DeviceMajor:0 DeviceMinor:233 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-315 DeviceMajor:0 DeviceMinor:315 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-489 DeviceMajor:0 DeviceMinor:489 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/4108f513-acef-473a-ab03-f3761b2bd0d8/volumes/kubernetes.io~projected/kube-api-access-qlj9x DeviceMajor:0 DeviceMinor:255 Capacity:32475521024 Type:vfs 
Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/047da584d2529f2fb501b0d6492e21ff57d43239aa0e696e0881839756608bc9/userdata/shm DeviceMajor:0 DeviceMinor:809 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/9cf6ce1a-c203-4033-86be-be16694a9062/volumes/kubernetes.io~projected/kube-api-access-ml7t9 DeviceMajor:0 DeviceMinor:561 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:632 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0d9abd668d0f5e4396724f5ea282e6dd1f64c0edb81c83294ae09a514ba683b4/userdata/shm DeviceMajor:0 DeviceMinor:652 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-856 DeviceMajor:0 DeviceMinor:856 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-699 DeviceMajor:0 DeviceMinor:699 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/09071d6fb6c4ca0d4b528c7dd15e2cf71933d8ef9eb4afe4554738db9dbc1331/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-537 DeviceMajor:0 DeviceMinor:537 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/bda3bd48-6de3-49b0-b2ce-96d97e97f178/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:581 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-596 DeviceMajor:0 DeviceMinor:596 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7ea81472-8a81-45ec-a07d-8710f47a927d/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:694 Capacity:200003584 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-807 DeviceMajor:0 DeviceMinor:807 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ba9496ed-060e-4118-9da6-89b82bd49263/volumes/kubernetes.io~projected/kube-api-access-6mv56 DeviceMajor:0 DeviceMinor:246 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-376 DeviceMajor:0 DeviceMinor:376 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-415 DeviceMajor:0 DeviceMinor:415 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:483 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-605 DeviceMajor:0 DeviceMinor:605 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ae9034db4782bac8f7d81887d48fe45bb1b3f6c402f0bfca0b19827ba74bb1e6/userdata/shm DeviceMajor:0 DeviceMinor:642 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed/userdata/shm DeviceMajor:0 DeviceMinor:794 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~projected/kube-api-access-wf24l DeviceMajor:0 DeviceMinor:269 Capacity:32475521024 
Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:442 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/52836130-d42e-495c-adbf-19ff9a393347/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:118 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9b8daf0d86e5f849d0599050d09899d17f88f967908e192636a1248a1c56494f/userdata/shm DeviceMajor:0 DeviceMinor:247 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~projected/kube-api-access-lh2rs DeviceMajor:0 DeviceMinor:445 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/bda3bd48-6de3-49b0-b2ce-96d97e97f178/volumes/kubernetes.io~projected/kube-api-access-885mp DeviceMajor:0 DeviceMinor:582 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/caa3a50c-1291-4152-a48a-f7c7b49627db/volumes/kubernetes.io~projected/kube-api-access-6bpwx DeviceMajor:0 DeviceMinor:844 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-481 DeviceMajor:0 DeviceMinor:481 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-564 DeviceMajor:0 DeviceMinor:564 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e9eec45b78006702c1cd45f46eb5970fb8a098410841c5321ec7f96f7dedcf63/userdata/shm DeviceMajor:0 DeviceMinor:434 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-612 DeviceMajor:0 DeviceMinor:612 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/5084fee9f78fd3409fb341ad2acde32d4d944cb2a228b141d353e9f022872f48/userdata/shm DeviceMajor:0 DeviceMinor:845 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-861 DeviceMajor:0 DeviceMinor:861 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-893 DeviceMajor:0 DeviceMinor:893 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:458 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:636 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:762 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-101 DeviceMajor:0 DeviceMinor:101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~projected/kube-api-access-s8dn9 DeviceMajor:0 DeviceMinor:225 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/eedc7538-9cc6-4bf5-9628-e278310d796b/volumes/kubernetes.io~projected/kube-api-access-lghdk DeviceMajor:0 DeviceMinor:258 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ee50d0908ef8218db69129064969d66d86843ed87cd667dcc60ef7e0d8a70f21/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs 
Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fdfd8c3453c2b861e5064997dedc9844e33b8b7f827e7ec1b0610a3364f7b575/userdata/shm DeviceMajor:0 DeviceMinor:276 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-550 DeviceMajor:0 DeviceMinor:550 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-282 DeviceMajor:0 DeviceMinor:282 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-565 DeviceMajor:0 DeviceMinor:565 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-676 DeviceMajor:0 DeviceMinor:676 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/00fbf6ab4c06c34054572affb4623a1ba6b78e4e0116048ccabe5fb462c0f796/userdata/shm DeviceMajor:0 DeviceMinor:572 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-907 DeviceMajor:0 DeviceMinor:907 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes/kubernetes.io~projected/kube-api-access-4nrpc DeviceMajor:0 DeviceMinor:250 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-339 DeviceMajor:0 DeviceMinor:339 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/de90d207-06d6-4778-b1b0-9020b1f2a881/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:441 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-535 DeviceMajor:0 DeviceMinor:535 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-613 DeviceMajor:0 DeviceMinor:613 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e9fad8077e3e386a60b5dc7bb7e5c6bd154a1e1fbbbc76393683792778d9fac5/userdata/shm DeviceMajor:0 DeviceMinor:644 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/04168952ada741f79304ee9b25e1212567fc1ce3d719a0050a26b711accbbea4/userdata/shm DeviceMajor:0 DeviceMinor:647 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/76ceb013-e999-4f15-bf25-f8dcd2647f9f/volumes/kubernetes.io~projected/kube-api-access-s8qqr DeviceMajor:0 DeviceMinor:115 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~projected/kube-api-access-pv8wt DeviceMajor:0 DeviceMinor:139 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f94d3d5f4c829c12aa9cdd79c3a8b919521e9b4705852dc7634f236661eedb2a/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/volumes/kubernetes.io~projected/kube-api-access-rzmjd DeviceMajor:0 DeviceMinor:373 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:635 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/68309159-130a-4ffa-acec-95dc4b795b8f/volumes/kubernetes.io~projected/kube-api-access-k58bm 
DeviceMajor:0 DeviceMinor:571 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/5a2c9576-f7bd-4ac5-a7fe-530f26642f97/volumes/kubernetes.io~projected/kube-api-access-84q5n DeviceMajor:0 DeviceMinor:563 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-760 DeviceMajor:0 DeviceMinor:760 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c3729e29-4c57-4f9b-8202-a87fd3a9a722/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:900 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e0503e27a82dda972ff0c87ff11d92e781cda40c87856d2a275affa50a9cc223/userdata/shm DeviceMajor:0 DeviceMinor:99 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:216 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a9677e44cf88488e86493a105f95a756fe5dcdb4e68b6740b2fed8252e50fe4c/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:260 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/05f6bb5a907a01ee11459dd7672be911eeedf74f96a5a1e584011854a9d81b18/userdata/shm DeviceMajor:0 DeviceMinor:385 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-701 DeviceMajor:0 DeviceMinor:701 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/232775f5a2d5493c0a82abf166454589f3f2855c9d7aba021d33f9d3267ef323/userdata/shm DeviceMajor:0 DeviceMinor:727 Capacity:67108864 Type:vfs 
Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5a6e3c3e5ef7bc2875a4438317596870f92270cd7e8853931713a647f7c41386/userdata/shm DeviceMajor:0 DeviceMinor:467 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:487 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/702795b7a3b9492f17a3552f3377a1320bf2ba8da965c8533a8f5f8dc47e6545/userdata/shm DeviceMajor:0 DeviceMinor:756 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/volumes/kubernetes.io~projected/kube-api-access-fvzs9 DeviceMajor:0 DeviceMinor:765 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-889 DeviceMajor:0 DeviceMinor:889 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-850 DeviceMajor:0 DeviceMinor:850 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ca93e52234c88361e8e1b6980fb2210aa6a1393de070fbd3e9cce73bed8cabfa/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-280 DeviceMajor:0 DeviceMinor:280 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-671 DeviceMajor:0 DeviceMinor:671 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/af653e87-ce5f-4f1a-a20d-233c563694ba/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:713 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/af653e87-ce5f-4f1a-a20d-233c563694ba/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 
DeviceMinor:722 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/5a2c9576-f7bd-4ac5-a7fe-530f26642f97/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:560 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/6bd07fa0-00f3-4267-b64a-1e7c02fdf148/volumes/kubernetes.io~projected/kube-api-access-h8fg7 DeviceMajor:0 DeviceMinor:920 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~projected/kube-api-access-rf57p DeviceMajor:0 DeviceMinor:226 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~projected/kube-api-access-llf9g DeviceMajor:0 DeviceMinor:270 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-292 DeviceMajor:0 DeviceMinor:292 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e023598af62616102fc3da25dddc7bed12c4ad58ecf15ebabad27596e663a5e7/userdata/shm DeviceMajor:0 DeviceMinor:645 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/fe33f926-9348-4498-a892-d2becaeecc14/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:792 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-393 DeviceMajor:0 DeviceMinor:393 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/40b87ce5bc138e32a2067ed918b783e37e64b1d585f2b0c8e8982345833631fd/userdata/shm DeviceMajor:0 DeviceMinor:491 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~projected/kube-api-access-m7hzl DeviceMajor:0 DeviceMinor:123 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~projected/kube-api-access-q9hb9 DeviceMajor:0 DeviceMinor:126 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~projected/kube-api-access-4lb9w DeviceMajor:0 DeviceMinor:229 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-417 DeviceMajor:0 DeviceMinor:417 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-531 DeviceMajor:0 DeviceMinor:531 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2dc664e3-7f37-4fba-8104-544ffb18c1bd/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:602 Capacity:200003584 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-879 DeviceMajor:0 DeviceMinor:879 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-905 DeviceMajor:0 DeviceMinor:905 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/823da3681a2bed2886a7203628b1c7e57f1dcee98965a56a67b863acb0d62939/userdata/shm DeviceMajor:0 DeviceMinor:146 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-474 DeviceMajor:0 DeviceMinor:474 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/09504f5af1d7c056fa184727bb790ba83f7a308b15d1c9ebf34076ea08bbf988/userdata/shm DeviceMajor:0 DeviceMinor:116 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2bbe9b81-0efb-4caa-bacd-55348cd392c6/volumes/kubernetes.io~projected/kube-api-access-5vt6t DeviceMajor:0 DeviceMinor:232 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/c31f7cee-d21d-4c23-af9b-1e0180b12e1e/volumes/kubernetes.io~projected/kube-api-access-xpd47 DeviceMajor:0 DeviceMinor:308 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~projected/kube-api-access-m2vmz DeviceMajor:0 DeviceMinor:231 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-588 DeviceMajor:0 DeviceMinor:588 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs 
Inodes:4108169 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:485 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/acb74744-fb99-4663-a7d0-7bae2db205e9/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:600 Capacity:200003584 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-94 DeviceMajor:0 DeviceMinor:94 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f/userdata/shm DeviceMajor:0 DeviceMinor:598 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/6a9d0240-fc00-4d78-9458-8f53b1876f1b/volumes/kubernetes.io~projected/kube-api-access-b2vvq DeviceMajor:0 DeviceMinor:804 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~projected/kube-api-access-422p2 DeviceMajor:0 DeviceMinor:125 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/7d23557f-6bb1-46ce-a56e-d0011c576125/volumes/kubernetes.io~projected/kube-api-access-bkfwp DeviceMajor:0 DeviceMinor:222 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-675 DeviceMajor:0 DeviceMinor:675 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-284 DeviceMajor:0 
DeviceMinor:284 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-667 DeviceMajor:0 DeviceMinor:667 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5f838d68f76fe62ef4db0397f12606cc88f67f5fe18c59d85b5a1981e0647d72/userdata/shm DeviceMajor:0 DeviceMinor:839 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/aadbbe97-2a03-40da-846d-252e29661f67/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-383 DeviceMajor:0 DeviceMinor:383 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-789 DeviceMajor:0 DeviceMinor:789 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-770 DeviceMajor:0 DeviceMinor:770 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176/userdata/shm DeviceMajor:0 DeviceMinor:811 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/8c0192f3-2e60-42c6-9836-c70a9fa407d5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-348 DeviceMajor:0 DeviceMinor:348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ab315e2bf3bc162089d02708f674e1774b5dd32ef86eb2274f79f03cc0e87f08/userdata/shm DeviceMajor:0 DeviceMinor:418 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/5888a4a5d71348b379d6e9015d48df3d9c05837487495c483686efe0c2418c25/userdata/shm DeviceMajor:0 DeviceMinor:468 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-505 DeviceMajor:0 DeviceMinor:505 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3b71760d7242393a221c7ae1af331545931cbaec81501f72daac6ae1c2882487/userdata/shm DeviceMajor:0 DeviceMinor:648 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bf0f6570fe84a3058c6d6122c2d052c6c8b6d42a5f14c4cbfb5452cbc6866dd1/userdata/shm DeviceMajor:0 DeviceMinor:877 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-887 DeviceMajor:0 DeviceMinor:887 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:228 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/volumes/kubernetes.io~projected/kube-api-access-wk7jv DeviceMajor:0 DeviceMinor:782 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/23bb11ba3a23ff0dd81fe52f6086906b383ab372fbf2896599dcd7f371be92ab/userdata/shm DeviceMajor:0 DeviceMinor:274 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-294 DeviceMajor:0 DeviceMinor:294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:409 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:641 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~projected/kube-api-access-982r4 DeviceMajor:0 DeviceMinor:768 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/1092f2a6-865c-4706-bba7-068621e85ebc/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:921 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/1092f2a6-865c-4706-bba7-068621e85ebc/volumes/kubernetes.io~projected/kube-api-access-llwh7 DeviceMajor:0 DeviceMinor:922 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c9f377bf-79c5-4425-b5d1-256961835f62/volumes/kubernetes.io~projected/kube-api-access-6nhh9 DeviceMajor:0 DeviceMinor:380 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-443 DeviceMajor:0 DeviceMinor:443 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d60de575ffc4ab26e9f652593a89b44b4516637c5944d999700e15162300a100/userdata/shm DeviceMajor:0 DeviceMinor:650 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/c0a08ddb-1045-4631-ba52-93f3046ebd0a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-669 DeviceMajor:0 DeviceMinor:669 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c3729e29-4c57-4f9b-8202-a87fd3a9a722/volumes/kubernetes.io~projected/kube-api-access-s27xv DeviceMajor:0 DeviceMinor:901 Capacity:32475521024 Type:vfs Inodes:4108169 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/0fd9eacf2ac04bcf14530f4abf810041d23d8565f4925575dea0bdaa4239be79/userdata/shm DeviceMajor:0 DeviceMinor:104 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/6432d23b-a55a-4131-83d5-5f16419809dd/volumes/kubernetes.io~projected/kube-api-access-7jk4m DeviceMajor:0 DeviceMinor:241 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-267 DeviceMajor:0 DeviceMinor:267 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dfe0357f-dab4-4424-869c-f6070b411a35/volumes/kubernetes.io~projected/kube-api-access-w5xsp DeviceMajor:0 DeviceMinor:583 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/17eaab63-9ba9-4a4a-891d-a76aa3f03b46/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:786 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/275be8d3-df30-46f7-9d0a-806e404dfd57/volumes/kubernetes.io~projected/kube-api-access-fb4bq DeviceMajor:0 DeviceMinor:263 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-288 DeviceMajor:0 DeviceMinor:288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-298 DeviceMajor:0 DeviceMinor:298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/887566f7bfcbbe929a3906442fbf37654b94c1b82760831978893cfbf803fe8a/userdata/shm DeviceMajor:0 DeviceMinor:465 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/volumes/kubernetes.io~projected/kube-api-access-4nwgh DeviceMajor:0 DeviceMinor:484 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-590 DeviceMajor:0 DeviceMinor:590 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/febf6a91-8b78-4b22-93b9-155cb7761fc4/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:766 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f330f5d17188d70a58eecf3d0a2330b70f4408aee114d8d6465e47081bd71e07/userdata/shm DeviceMajor:0 DeviceMinor:374 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/1a6e3f01-0f22-4961-b450-56aca5477943/volumes/kubernetes.io~projected/kube-api-access-qj8dt DeviceMajor:0 DeviceMinor:779 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-796 DeviceMajor:0 DeviceMinor:796 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/50ab8f71-42b8-4967-8a0b-016647c59a37/volumes/kubernetes.io~projected/kube-api-access-h9jsw DeviceMajor:0 DeviceMinor:562 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7ca736d5c0f4dfec580e9c43c992c5b2d64a84418a876f71fc7a9325b6f9c563/userdata/shm DeviceMajor:0 DeviceMinor:788 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3b767f72fbe851c0148683712fba4f0872103808c8eb0533886fa5261badacc5/userdata/shm DeviceMajor:0 DeviceMinor:495 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-798 DeviceMajor:0 DeviceMinor:798 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33c15b06-a21e-411f-b324-3ae0c7f0e9a4/volumes/kubernetes.io~projected/kube-api-access-qt69c DeviceMajor:0 DeviceMinor:778 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/var/lib/kubelet/pods/4f822854-b9ac-46f2-b03b-e7215fba9208/volumes/kubernetes.io~projected/kube-api-access-rk5ll DeviceMajor:0 DeviceMinor:224 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/3178dfc0-a35e-418e-a954-cd919b8af88c/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:236 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:460 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:440 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-533 DeviceMajor:0 DeviceMinor:533 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/770f1062326f2c1fcb8406b21c197925d94d8f11835c27505f3a03f752984724/userdata/shm DeviceMajor:0 DeviceMinor:584 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa875da543c11c30db67f621b636a4334559efa7a8f51023550cfe8454360f9c/userdata/shm DeviceMajor:0 DeviceMinor:610 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/e74c8bb2-e063-4b60-b3fe-651aa534d029/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:454 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-848 DeviceMajor:0 DeviceMinor:848 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/111fa7be9df663403130635550a8dd29af3564ad73e3800569dcb7f1fe8c2849/userdata/shm DeviceMajor:0 DeviceMinor:431 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/982ea338-c7be-4776-9bb7-113834c54aaa/volumes/kubernetes.io~projected/kube-api-access-r8pfx DeviceMajor:0 DeviceMinor:98 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/3bf93333-b537-4f23-9c77-6a245b290fe3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~projected/kube-api-access-8zh5b DeviceMajor:0 DeviceMinor:271 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-446 DeviceMajor:0 DeviceMinor:446 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8f99f81a-fd2d-432e-a3bc-e451342650b1/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:462 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:439 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cb95d850788ef393eb0f1ea09b2f5ec3ff3892998a30374932e76aea89c669e6/userdata/shm DeviceMajor:0 DeviceMinor:653 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-858 DeviceMajor:0 DeviceMinor:858 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:773 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/c2612900a8ac574ae0f0de5042c80b40bcd11d67937802f2e2a9d7d11e0fa929/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/70fba73e-c201-4866-bc69-64892ea5bdca/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:242 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-509 DeviceMajor:0 DeviceMinor:509 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:127 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:243 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-278 DeviceMajor:0 DeviceMinor:278 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-296 DeviceMajor:0 DeviceMinor:296 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dac2b210-2fbb-4d25-a0ea-1825259cee3b/volumes/kubernetes.io~projected/kube-api-access-ntd2k DeviceMajor:0 DeviceMinor:639 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-673 DeviceMajor:0 DeviceMinor:673 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c31f7cee-d21d-4c23-af9b-1e0180b12e1e/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:307 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/var/lib/kubelet/pods/d358134e-5625-492c-b4f7-460798631270/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-665 DeviceMajor:0 DeviceMinor:665 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a153dffc082c8d8e34a6c6e6c0c21f4bb223cf1b6ae19843ae82a4a21f8d697f/userdata/shm DeviceMajor:0 DeviceMinor:829 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/caa3a50c-1291-4152-a48a-f7c7b49627db/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:843 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/f99d6808-9fec-402d-93f7-41575a5a0a08/volumes/kubernetes.io~projected/kube-api-access-556dx DeviceMajor:0 DeviceMinor:488 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-603 DeviceMajor:0 DeviceMinor:603 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-732 DeviceMajor:0 DeviceMinor:732 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9262fbf1bea820c1d6587d88007865b18e2191381d424746083eaa6434ea0fcd/userdata/shm DeviceMajor:0 DeviceMinor:771 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/fd6b827c-70b0-47ed-b07c-c696343248a8/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:459 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-480 DeviceMajor:0 DeviceMinor:480 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 
DeviceMinor:350 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/52836130-d42e-495c-adbf-19ff9a393347/volumes/kubernetes.io~projected/kube-api-access-mdxtt DeviceMajor:0 DeviceMinor:114 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b4547ec615a98dfc1a4d3f423cf139e6774712d10eb4a01d3d753a13dcc2d3fd/userdata/shm DeviceMajor:0 DeviceMinor:790 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/53254b19-b5b3-4f97-bc64-37be8b2a41b7/volumes/kubernetes.io~projected/kube-api-access-dskxf DeviceMajor:0 DeviceMinor:486 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:/var/lib/kubelet/pods/d83aa242-606f-4adc-b689-4aa89625b533/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:628 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-464 DeviceMajor:0 DeviceMinor:464 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30/userdata/shm DeviceMajor:0 DeviceMinor:601 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/47b9c8e39f771f4a9c9b48442e3a3c8bc53bf486bacb3ac02dc486b0fde5415d/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108169 
HasInodes:true} {Device:overlay_0-823 DeviceMajor:0 DeviceMinor:823 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-819 DeviceMajor:0 DeviceMinor:819 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b83ab56c-e28d-4e82-ae8f-92649a1448ed/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:234 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-302 DeviceMajor:0 DeviceMinor:302 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df48884077bbd8fba7227e7cebbab4db2812d597832db429a888dd44decf9996/userdata/shm DeviceMajor:0 DeviceMinor:381 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-499 DeviceMajor:0 DeviceMinor:499 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33c15b06-a21e-411f-b324-3ae0c7f0e9a4/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:777 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-622 DeviceMajor:0 DeviceMinor:622 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/e71caa06-6ce7-47c9-a267-21f6b6af9247/volumes/kubernetes.io~projected/kube-api-access-zrdxk DeviceMajor:0 DeviceMinor:249 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:overlay_0-432 DeviceMajor:0 DeviceMinor:432 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-478 DeviceMajor:0 DeviceMinor:478 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-834 DeviceMajor:0 DeviceMinor:834 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-452 DeviceMajor:0 DeviceMinor:452 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b05d5093-20f4-42d5-9db3-811e049cc1b6/volumes/kubernetes.io~projected/kube-api-access-ddxbs DeviceMajor:0 DeviceMinor:755 Capacity:32475521024 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-817 DeviceMajor:0 DeviceMinor:817 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cb298ff85bc6afefe78d9670cec4232d77064bf8eb867d648f99dcfde97ded03/userdata/shm DeviceMajor:0 DeviceMinor:805 Capacity:67108864 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-813 DeviceMajor:0 DeviceMinor:813 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-873 DeviceMajor:0 DeviceMinor:873 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-540 DeviceMajor:0 DeviceMinor:540 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-558 DeviceMajor:0 DeviceMinor:558 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-679 DeviceMajor:0 DeviceMinor:679 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 
Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:00fbf6ab4c06c34 MacAddress:1a:62:74:30:71:1a Speed:10000 Mtu:8900} {Name:04168952ada741f MacAddress:a2:ed:f9:d5:a8:62 Speed:10000 Mtu:8900} {Name:047da584d2529f2 MacAddress:ae:1c:f7:55:49:48 Speed:10000 Mtu:8900} {Name:05f6bb5a907a01e MacAddress:16:76:ba:e9:fd:3f Speed:10000 Mtu:8900} {Name:09504f5af1d7c05 MacAddress:66:62:95:ab:d0:3b Speed:10000 Mtu:8900} {Name:0d9abd668d0f5e4 MacAddress:fe:bc:1b:28:15:43 Speed:10000 Mtu:8900} {Name:111fa7be9df6634 MacAddress:52:7e:aa:90:68:19 Speed:10000 Mtu:8900} {Name:11a698247fc6f9c MacAddress:52:4a:a3:f5:cf:65 Speed:10000 Mtu:8900} {Name:11e59ac64a0deba MacAddress:82:bf:78:a7:94:02 Speed:10000 Mtu:8900} {Name:23bb11ba3a23ff0 MacAddress:6a:d8:b1:44:5d:e1 Speed:10000 Mtu:8900} {Name:26d2cf7f711f68c MacAddress:82:99:75:47:e8:ef Speed:10000 Mtu:8900} {Name:33074446cdbc881 MacAddress:a2:c6:35:a3:99:b5 Speed:10000 Mtu:8900} {Name:3b71760d7242393 MacAddress:c6:45:7d:8e:7b:ff Speed:10000 Mtu:8900} {Name:40b87ce5bc138e3 MacAddress:8a:a6:d7:e1:91:a6 Speed:10000 Mtu:8900} {Name:47b9c8e39f771f4 MacAddress:8a:45:9a:08:24:e8 Speed:10000 Mtu:8900} {Name:5084fee9f78fd34 MacAddress:22:01:66:9c:29:bf Speed:10000 Mtu:8900} {Name:5888a4a5d71348b MacAddress:92:9a:28:63:c1:3b Speed:10000 Mtu:8900} {Name:5a6e3c3e5ef7bc2 MacAddress:c2:f5:46:1a:2b:6b Speed:10000 Mtu:8900} {Name:5e30ced5c1465fa MacAddress:52:23:c3:7b:2a:e5 Speed:10000 Mtu:8900} {Name:5f838d68f76fe62 MacAddress:ea:8d:bd:86:90:85 Speed:10000 Mtu:8900} {Name:6c725404c21fc1f MacAddress:7e:e9:05:3d:81:17 Speed:10000 Mtu:8900} {Name:702795b7a3b9492 MacAddress:06:e3:22:fb:a4:e0 Speed:10000 Mtu:8900} {Name:755ba253608e09c MacAddress:56:48:6c:c6:85:af Speed:10000 Mtu:8900} {Name:770f1062326f2c1 MacAddress:5a:2b:52:b8:e5:3b Speed:10000 Mtu:8900} {Name:774988f3ef3d0b5 MacAddress:e2:cd:ec:3e:96:c3 
Speed:10000 Mtu:8900} {Name:7ca736d5c0f4dfe MacAddress:8a:b6:7a:7d:2b:86 Speed:10000 Mtu:8900} {Name:887566f7bfcbbe9 MacAddress:22:c0:4e:1e:b9:b9 Speed:10000 Mtu:8900} {Name:9262fbf1bea820c MacAddress:36:12:fe:38:51:4e Speed:10000 Mtu:8900} {Name:92d33cd4d391db4 MacAddress:ce:d2:c7:63:cf:fe Speed:10000 Mtu:8900} {Name:9afe00eb0e4be58 MacAddress:ae:43:ed:1a:58:59 Speed:10000 Mtu:8900} {Name:9b8daf0d86e5f84 MacAddress:22:19:ce:7e:45:75 Speed:10000 Mtu:8900} {Name:9c380d8376b93cf MacAddress:32:87:85:70:e4:91 Speed:10000 Mtu:8900} {Name:a338d440c2bb75e MacAddress:aa:2a:85:12:d7:fe Speed:10000 Mtu:8900} {Name:a8085b4e985b562 MacAddress:ce:82:ae:f6:8a:b8 Speed:10000 Mtu:8900} {Name:a9677e44cf88488 MacAddress:5e:8c:9d:84:8f:c8 Speed:10000 Mtu:8900} {Name:aa875da543c11c3 MacAddress:c2:8a:da:c7:9c:72 Speed:10000 Mtu:8900} {Name:ab315e2bf3bc162 MacAddress:92:61:27:9e:36:73 Speed:10000 Mtu:8900} {Name:ac8609fedee0058 MacAddress:12:8b:37:30:4c:6a Speed:10000 Mtu:8900} {Name:ae9034db4782bac MacAddress:02:1d:33:4a:40:49 Speed:10000 Mtu:8900} {Name:b4547ec615a98df MacAddress:ee:c8:96:b1:15:96 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:76:1e:7d:2a:6e:ef Speed:0 Mtu:8900} {Name:ca93e52234c8836 MacAddress:2e:21:79:13:42:36 Speed:10000 Mtu:8900} {Name:cb298ff85bc6afe MacAddress:06:a7:00:d6:62:ad Speed:10000 Mtu:8900} {Name:cb95d850788ef39 MacAddress:1e:12:0f:ef:04:3f Speed:10000 Mtu:8900} {Name:cc2c80df51f3639 MacAddress:d6:b0:26:c5:89:15 Speed:10000 Mtu:8900} {Name:d60de575ffc4ab2 MacAddress:ee:9d:22:14:c6:bb Speed:10000 Mtu:8900} {Name:df48884077bbd8f MacAddress:82:18:d9:3b:01:28 Speed:10000 Mtu:8900} {Name:e023598af626161 MacAddress:96:69:02:21:71:39 Speed:10000 Mtu:8900} {Name:e51dfab7b748272 MacAddress:9e:05:42:95:de:68 Speed:10000 Mtu:8900} {Name:e9eec45b7800670 MacAddress:8e:3b:eb:af:a6:9d Speed:10000 Mtu:8900} {Name:e9fad8077e3e386 MacAddress:ce:47:e8:af:5e:b7 Speed:10000 Mtu:8900} {Name:ee50d0908ef8218 
MacAddress:de:9a:b9:4b:8f:d3 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:3a:b1:eb Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:76:56:79 Speed:-1 Mtu:9000} {Name:f330f5d17188d70 MacAddress:32:f2:81:94:e1:d6 Speed:10000 Mtu:8900} {Name:f94d3d5f4c829c1 MacAddress:42:a4:49:ee:e9:73 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:06:cd:ac:c4:de:2f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 03:13:18.055775 master-0 kubenswrapper[13046]: I0308 03:13:18.055296 13046 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 08 03:13:18.055775 master-0 kubenswrapper[13046]: I0308 03:13:18.055358 13046 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 03:13:18.055775 master-0 kubenswrapper[13046]: I0308 03:13:18.055583 13046 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 03:13:18.055775 master-0 kubenswrapper[13046]: I0308 03:13:18.055715 13046 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 03:13:18.056105 master-0 kubenswrapper[13046]: I0308 03:13:18.055743 13046 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"
Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 03:13:18.056105 master-0 kubenswrapper[13046]: I0308 03:13:18.055941 13046 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 03:13:18.056105 master-0 kubenswrapper[13046]: I0308 03:13:18.055952 13046 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 03:13:18.056105 master-0 kubenswrapper[13046]: I0308 03:13:18.055960 13046 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:13:18.056105 master-0 kubenswrapper[13046]: I0308 03:13:18.055981 13046 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 03:13:18.056105 master-0 kubenswrapper[13046]: I0308 03:13:18.056013 13046 state_mem.go:36] "Initialized new in-memory state store" Mar 08 03:13:18.056274 master-0 kubenswrapper[13046]: I0308 03:13:18.056130 13046 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 03:13:18.056274 master-0 kubenswrapper[13046]: I0308 03:13:18.056185 13046 kubelet.go:418] "Attempting to sync node with API server" Mar 08 03:13:18.056274 master-0 kubenswrapper[13046]: I0308 03:13:18.056196 13046 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 03:13:18.056274 master-0 kubenswrapper[13046]: I0308 03:13:18.056210 13046 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 03:13:18.056274 master-0 kubenswrapper[13046]: I0308 03:13:18.056225 13046 kubelet.go:324] "Adding apiserver pod source" Mar 
08 03:13:18.056274 master-0 kubenswrapper[13046]: I0308 03:13:18.056235 13046 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 03:13:18.057865 master-0 kubenswrapper[13046]: I0308 03:13:18.057503 13046 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 08 03:13:18.057937 master-0 kubenswrapper[13046]: I0308 03:13:18.057895 13046 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 08 03:13:18.058058 master-0 kubenswrapper[13046]: W0308 03:13:18.057989 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:18.058058 master-0 kubenswrapper[13046]: E0308 03:13:18.058055 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:18.058194 master-0 kubenswrapper[13046]: W0308 03:13:18.058082 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:18.058194 master-0 kubenswrapper[13046]: I0308 03:13:18.058178 13046 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 08 03:13:18.058277 master-0 kubenswrapper[13046]: E0308 03:13:18.058195 13046 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:18.058370 master-0 kubenswrapper[13046]: I0308 03:13:18.058331 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 08 03:13:18.058370 master-0 kubenswrapper[13046]: I0308 03:13:18.058356 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 08 03:13:18.058370 master-0 kubenswrapper[13046]: I0308 03:13:18.058364 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 08 03:13:18.058370 master-0 kubenswrapper[13046]: I0308 03:13:18.058372 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058381 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058389 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058397 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058404 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058413 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058420 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058430 13046 plugins.go:603] "Loaded volume 
plugin" pluginName="kubernetes.io/projected" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058442 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 08 03:13:18.058575 master-0 kubenswrapper[13046]: I0308 03:13:18.058476 13046 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 08 03:13:18.058898 master-0 kubenswrapper[13046]: I0308 03:13:18.058874 13046 server.go:1280] "Started kubelet" Mar 08 03:13:18.060057 master-0 kubenswrapper[13046]: I0308 03:13:18.059749 13046 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 03:13:18.062621 master-0 kubenswrapper[13046]: I0308 03:13:18.060357 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:18.062621 master-0 kubenswrapper[13046]: I0308 03:13:18.061385 13046 server.go:449] "Adding debug handlers to kubelet server" Mar 08 03:13:18.060445 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 08 03:13:18.067813 master-0 kubenswrapper[13046]: I0308 03:13:18.059068 13046 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 03:13:18.067889 master-0 kubenswrapper[13046]: I0308 03:13:18.067845 13046 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 08 03:13:18.068412 master-0 kubenswrapper[13046]: I0308 03:13:18.068350 13046 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 08 03:13:18.070499 master-0 kubenswrapper[13046]: E0308 03:13:18.070298 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:13:18.079069 master-0 kubenswrapper[13046]: I0308 03:13:18.078093 13046 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 08 03:13:18.079069 master-0 kubenswrapper[13046]: I0308 03:13:18.078461 13046 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-09 03:01:34 +0000 UTC, rotation deadline is 2026-03-08 23:08:52.746753919 +0000 UTC Mar 08 03:13:18.079069 master-0 kubenswrapper[13046]: I0308 03:13:18.078538 13046 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Waiting 19h55m34.668218257s for next certificate rotation Mar 08 03:13:18.079069 master-0 kubenswrapper[13046]: I0308 03:13:18.078661 13046 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 08 03:13:18.079748 master-0 kubenswrapper[13046]: I0308 03:13:18.079309 13046 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 08 03:13:18.079748 master-0 kubenswrapper[13046]: I0308 03:13:18.079328 13046 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 08 03:13:18.079748 master-0 kubenswrapper[13046]: I0308 03:13:18.079433 13046 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 08 03:13:18.079748 master-0 kubenswrapper[13046]: E0308 03:13:18.079556 13046 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:13:18.080753 master-0 kubenswrapper[13046]: I0308 03:13:18.080723 13046 factory.go:55] Registering systemd factory Mar 08 03:13:18.080753 master-0 kubenswrapper[13046]: I0308 03:13:18.080749 13046 factory.go:221] Registration of the systemd container factory successfully Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: I0308 03:13:18.081463 13046 factory.go:153] Registering CRI-O factory Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: I0308 03:13:18.081495 13046 factory.go:221] Registration of the crio container factory successfully Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: I0308 03:13:18.081562 13046 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: I0308 03:13:18.081584 13046 factory.go:103] Registering Raw factory Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: I0308 03:13:18.081602 13046 manager.go:1196] Started watching for new 
ooms in manager Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: W0308 03:13:18.081754 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: E0308 03:13:18.081823 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:18.081957 master-0 kubenswrapper[13046]: E0308 03:13:18.081815 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 08 03:13:18.082313 master-0 kubenswrapper[13046]: I0308 03:13:18.082215 13046 manager.go:319] Starting recovery of all containers Mar 08 03:13:18.096665 master-0 kubenswrapper[13046]: I0308 03:13:18.096574 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dac2b210-2fbb-4d25-a0ea-1825259cee3b" volumeName="kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-client" seLinuxMountContext="" Mar 08 03:13:18.096665 master-0 kubenswrapper[13046]: I0308 03:13:18.096655 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e81d3c37-e8d7-44c8-973e-13992380ce85" volumeName="kubernetes.io/projected/e81d3c37-e8d7-44c8-973e-13992380ce85-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096674 
13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53254b19-b5b3-4f97-bc64-37be8b2a41b7" volumeName="kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096691 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096705 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c31f7cee-d21d-4c23-af9b-1e0180b12e1e" volumeName="kubernetes.io/projected/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-kube-api-access-xpd47" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096718 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/projected/f99d6808-9fec-402d-93f7-41575a5a0a08-kube-api-access-556dx" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096731 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53254b19-b5b3-4f97-bc64-37be8b2a41b7" volumeName="kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-kube-api-access-dskxf" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096744 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" volumeName="kubernetes.io/empty-dir/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-cache" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 
03:13:18.096761 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096774 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bda3bd48-6de3-49b0-b2ce-96d97e97f178" volumeName="kubernetes.io/secret/bda3bd48-6de3-49b0-b2ce-96d97e97f178-metrics-tls" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096788 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e74c8bb2-e063-4b60-b3fe-651aa534d029" volumeName="kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096803 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096819 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0781e6af-f5b5-40f7-bb7f-5bc6978b4957" volumeName="kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096853 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" volumeName="kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096864 13046 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="caa3a50c-1291-4152-a48a-f7c7b49627db" volumeName="kubernetes.io/projected/caa3a50c-1291-4152-a48a-f7c7b49627db-kube-api-access-6bpwx" seLinuxMountContext="" Mar 08 03:13:18.096877 master-0 kubenswrapper[13046]: I0308 03:13:18.096887 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52836130-d42e-495c-adbf-19ff9a393347" volumeName="kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.096905 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" volumeName="kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-kube-api-access-4nwgh" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.096921 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eedc7538-9cc6-4bf5-9628-e278310d796b" volumeName="kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.096934 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-image-import-ca" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.096947 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3178dfc0-a35e-418e-a954-cd919b8af88c" volumeName="kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 
03:13:18.096961 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9d0240-fc00-4d78-9458-8f53b1876f1b" volumeName="kubernetes.io/projected/6a9d0240-fc00-4d78-9458-8f53b1876f1b-kube-api-access-b2vvq" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.096973 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" volumeName="kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.096990 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1092f2a6-865c-4706-bba7-068621e85ebc" volumeName="kubernetes.io/projected/1092f2a6-865c-4706-bba7-068621e85ebc-kube-api-access-llwh7" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097007 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1092f2a6-865c-4706-bba7-068621e85ebc" volumeName="kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097022 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-config" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097033 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3729e29-4c57-4f9b-8202-a87fd3a9a722" volumeName="kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097086 13046 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68309159-130a-4ffa-acec-95dc4b795b8f" volumeName="kubernetes.io/projected/68309159-130a-4ffa-acec-95dc4b795b8f-kube-api-access-k58bm" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097104 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097118 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="306b824f-dcfb-4e69-9a23-64dfbae61852" volumeName="kubernetes.io/projected/306b824f-dcfb-4e69-9a23-64dfbae61852-kube-api-access-4pxwl" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097133 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9949f9f4-00f3-4ac8-b8a2-a9549693f5b1" volumeName="kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097146 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097161 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 
03:13:18.097174 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097189 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e324f6c-ee4c-42bc-b241-9c6938749854" volumeName="kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097201 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097214 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="febf6a91-8b78-4b22-93b9-155cb7761fc4" volumeName="kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-apiservice-cert" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097226 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a6e3f01-0f22-4961-b450-56aca5477943" volumeName="kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097240 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4108f513-acef-473a-ab03-f3761b2bd0d8" volumeName="kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 
03:13:18.097252 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fd6b827c-70b0-47ed-b07c-c696343248a8" volumeName="kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097264 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097278 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50ab8f71-42b8-4967-8a0b-016647c59a37" volumeName="kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-utilities" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097291 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70fba73e-c201-4866-bc69-64892ea5bdca" volumeName="kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097305 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9949f9f4-00f3-4ac8-b8a2-a9549693f5b1" volumeName="kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097318 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="de90d207-06d6-4778-b1b0-9020b1f2a881" volumeName="kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-tuned" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: 
I0308 03:13:18.097383 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e71caa06-6ce7-47c9-a267-21f6b6af9247" volumeName="kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097427 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f822854-b9ac-46f2-b03b-e7215fba9208" volumeName="kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097442 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" volumeName="kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert" seLinuxMountContext="" Mar 08 03:13:18.097421 master-0 kubenswrapper[13046]: I0308 03:13:18.097455 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6432d23b-a55a-4131-83d5-5f16419809dd" volumeName="kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097468 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" volumeName="kubernetes.io/empty-dir/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-snapshots" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097536 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6432d23b-a55a-4131-83d5-5f16419809dd" volumeName="kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097555 
13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d23557f-6bb1-46ce-a56e-d0011c576125" volumeName="kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097567 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aadbbe97-2a03-40da-846d-252e29661f67" volumeName="kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097586 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05d5093-20f4-42d5-9db3-811e049cc1b6" volumeName="kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097603 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bbe9b81-0efb-4caa-bacd-55348cd392c6" volumeName="kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097622 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dac2b210-2fbb-4d25-a0ea-1825259cee3b" volumeName="kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097644 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" volumeName="kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 
kubenswrapper[13046]: I0308 03:13:18.097700 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097715 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" volumeName="kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097727 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dac2b210-2fbb-4d25-a0ea-1825259cee3b" volumeName="kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-serving-ca" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097743 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dac2b210-2fbb-4d25-a0ea-1825259cee3b" volumeName="kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-trusted-ca-bundle" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097758 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097772 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fe33f926-9348-4498-a892-d2becaeecc14" volumeName="kubernetes.io/projected/fe33f926-9348-4498-a892-d2becaeecc14-kube-api-access-rtt8w" seLinuxMountContext="" Mar 08 03:13:18.098898 
master-0 kubenswrapper[13046]: I0308 03:13:18.097786 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="234638fe-5577-45bc-9094-907c5611da38" volumeName="kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097799 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097843 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3729e29-4c57-4f9b-8202-a87fd3a9a722" volumeName="kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097858 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e324f6c-ee4c-42bc-b241-9c6938749854" volumeName="kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097873 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aadbbe97-2a03-40da-846d-252e29661f67" volumeName="kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097886 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 
kubenswrapper[13046]: I0308 03:13:18.097898 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097909 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fe33f926-9348-4498-a892-d2becaeecc14" volumeName="kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097923 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1092f2a6-865c-4706-bba7-068621e85ebc" volumeName="kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097935 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af653e87-ce5f-4f1a-a20d-233c563694ba" volumeName="kubernetes.io/configmap/af653e87-ce5f-4f1a-a20d-233c563694ba-service-ca" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097947 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-client" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097959 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05d5093-20f4-42d5-9db3-811e049cc1b6" volumeName="kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 
03:13:18.097972 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c31f7cee-d21d-4c23-af9b-1e0180b12e1e" volumeName="kubernetes.io/secret/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-webhook-certs" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.097990 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="982ea338-c7be-4776-9bb7-113834c54aaa" volumeName="kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098008 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fd6b827c-70b0-47ed-b07c-c696343248a8" volumeName="kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098023 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f822854-b9ac-46f2-b03b-e7215fba9208" volumeName="kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098034 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098046 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" volumeName="kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 
03:13:18.098058 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a08ddb-1045-4631-ba52-93f3046ebd0a" volumeName="kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098072 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098094 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="234638fe-5577-45bc-9094-907c5611da38" volumeName="kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098107 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="23b66415-df37-4015-9a0c-69115b3a0739" volumeName="kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098120 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" volumeName="kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098131 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e74c8bb2-e063-4b60-b3fe-651aa534d029" volumeName="kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 
03:13:18.098142 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098155 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3178dfc0-a35e-418e-a954-cd919b8af88c" volumeName="kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098168 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf93333-b537-4f23-9c77-6a245b290fe3" volumeName="kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098198 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="275be8d3-df30-46f7-9d0a-806e404dfd57" volumeName="kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098214 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098230 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d23557f-6bb1-46ce-a56e-d0011c576125" volumeName="kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 
03:13:18.098242 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba9496ed-060e-4118-9da6-89b82bd49263" volumeName="kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098255 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e74c8bb2-e063-4b60-b3fe-651aa534d029" volumeName="kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098269 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a6e3f01-0f22-4961-b450-56aca5477943" volumeName="kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098282 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4108f513-acef-473a-ab03-f3761b2bd0d8" volumeName="kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098295 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e324f6c-ee4c-42bc-b241-9c6938749854" volumeName="kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098307 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aadbbe97-2a03-40da-846d-252e29661f67" volumeName="kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 
kubenswrapper[13046]: I0308 03:13:18.098320 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17eaab63-9ba9-4a4a-891d-a76aa3f03b46" volumeName="kubernetes.io/projected/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-kube-api-access-pb2xh" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098333 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a6e3f01-0f22-4961-b450-56aca5477943" volumeName="kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098345 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9cf6ce1a-c203-4033-86be-be16694a9062" volumeName="kubernetes.io/projected/9cf6ce1a-c203-4033-86be-be16694a9062-kube-api-access-ml7t9" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098359 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-encryption-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098373 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52836130-d42e-495c-adbf-19ff9a393347" volumeName="kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098386 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" volumeName="kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl" seLinuxMountContext="" Mar 
08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098407 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" volumeName="kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098421 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bda3bd48-6de3-49b0-b2ce-96d97e97f178" volumeName="kubernetes.io/configmap/bda3bd48-6de3-49b0-b2ce-96d97e97f178-config-volume" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098435 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52836130-d42e-495c-adbf-19ff9a393347" volumeName="kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098449 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af653e87-ce5f-4f1a-a20d-233c563694ba" volumeName="kubernetes.io/projected/af653e87-ce5f-4f1a-a20d-233c563694ba-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098461 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="febf6a91-8b78-4b22-93b9-155cb7761fc4" volumeName="kubernetes.io/projected/febf6a91-8b78-4b22-93b9-155cb7761fc4-kube-api-access-982r4" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098474 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" volumeName="kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles" 
seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098505 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76ceb013-e999-4f15-bf25-f8dcd2647f9f" volumeName="kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098520 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9cf6ce1a-c203-4033-86be-be16694a9062" volumeName="kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-utilities" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098534 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af653e87-ce5f-4f1a-a20d-233c563694ba" volumeName="kubernetes.io/secret/af653e87-ce5f-4f1a-a20d-233c563694ba-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098547 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05d5093-20f4-42d5-9db3-811e049cc1b6" volumeName="kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098561 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bda3bd48-6de3-49b0-b2ce-96d97e97f178" volumeName="kubernetes.io/projected/bda3bd48-6de3-49b0-b2ce-96d97e97f178-kube-api-access-885mp" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098574 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" volumeName="kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle" 
seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098660 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b83ab56c-e28d-4e82-ae8f-92649a1448ed" volumeName="kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098673 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098687 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f99f81a-fd2d-432e-a3bc-e451342650b1" volumeName="kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098698 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a08ddb-1045-4631-ba52-93f3046ebd0a" volumeName="kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098712 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098725 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="de90d207-06d6-4778-b1b0-9020b1f2a881" volumeName="kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-tmp" 
seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098737 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2dc664e3-7f37-4fba-8104-544ffb18c1bd" volumeName="kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098753 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50ab8f71-42b8-4967-8a0b-016647c59a37" volumeName="kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-catalog-content" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098764 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" volumeName="kubernetes.io/projected/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-kube-api-access-fvzs9" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098777 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d83aa242-606f-4adc-b689-4aa89625b533" volumeName="kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098788 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3729e29-4c57-4f9b-8202-a87fd3a9a722" volumeName="kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098801 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9d0240-fc00-4d78-9458-8f53b1876f1b" volumeName="kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-catalog-content" 
seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098814 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3178dfc0-a35e-418e-a954-cd919b8af88c" volumeName="kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098825 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098837 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" volumeName="kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098849 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53254b19-b5b3-4f97-bc64-37be8b2a41b7" volumeName="kubernetes.io/empty-dir/53254b19-b5b3-4f97-bc64-37be8b2a41b7-cache" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098861 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53254b19-b5b3-4f97-bc64-37be8b2a41b7" volumeName="kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-ca-certs" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098873 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70fba73e-c201-4866-bc69-64892ea5bdca" volumeName="kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config" seLinuxMountContext="" Mar 08 
03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098885 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f99f81a-fd2d-432e-a3bc-e451342650b1" volumeName="kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098900 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="acb74744-fb99-4663-a7d0-7bae2db205e9" volumeName="kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098916 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eedc7538-9cc6-4bf5-9628-e278310d796b" volumeName="kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098937 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="234638fe-5577-45bc-9094-907c5611da38" volumeName="kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098955 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf93333-b537-4f23-9c77-6a245b290fe3" volumeName="kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098970 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" volumeName="kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca" seLinuxMountContext="" Mar 08 
03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.098988 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9949f9f4-00f3-4ac8-b8a2-a9549693f5b1" volumeName="kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.099004 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9f377bf-79c5-4425-b5d1-256961835f62" volumeName="kubernetes.io/configmap/c9f377bf-79c5-4425-b5d1-256961835f62-signing-cabundle" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.099021 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="febf6a91-8b78-4b22-93b9-155cb7761fc4" volumeName="kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-webhook-cert" seLinuxMountContext="" Mar 08 03:13:18.098898 master-0 kubenswrapper[13046]: I0308 03:13:18.099039 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="23b66415-df37-4015-9a0c-69115b3a0739" volumeName="kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099055 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099067 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c" volumeName="kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 
kubenswrapper[13046]: I0308 03:13:18.099078 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-audit" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099094 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a2c9576-f7bd-4ac5-a7fe-530f26642f97" volumeName="kubernetes.io/projected/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-kube-api-access-84q5n" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099110 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9f377bf-79c5-4425-b5d1-256961835f62" volumeName="kubernetes.io/projected/c9f377bf-79c5-4425-b5d1-256961835f62-kube-api-access-6nhh9" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099126 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fe33f926-9348-4498-a892-d2becaeecc14" volumeName="kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099142 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dac2b210-2fbb-4d25-a0ea-1825259cee3b" volumeName="kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-encryption-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099156 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" volumeName="kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 
kubenswrapper[13046]: I0308 03:13:18.099168 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33c15b06-a21e-411f-b324-3ae0c7f0e9a4" volumeName="kubernetes.io/projected/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-kube-api-access-qt69c" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099181 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" volumeName="kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-ca-certs" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099197 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9cf6ce1a-c203-4033-86be-be16694a9062" volumeName="kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-catalog-content" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099215 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099230 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3729e29-4c57-4f9b-8202-a87fd3a9a722" volumeName="kubernetes.io/projected/c3729e29-4c57-4f9b-8202-a87fd3a9a722-kube-api-access-s27xv" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099246 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf93333-b537-4f23-9c77-6a245b290fe3" volumeName="kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates" seLinuxMountContext="" Mar 08 03:13:18.102307 
master-0 kubenswrapper[13046]: I0308 03:13:18.099260 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-serving-ca" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099274 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099291 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7ea81472-8a81-45ec-a07d-8710f47a927d" volumeName="kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099305 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b83ab56c-e28d-4e82-ae8f-92649a1448ed" volumeName="kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099320 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17eaab63-9ba9-4a4a-891d-a76aa3f03b46" volumeName="kubernetes.io/configmap/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-auth-proxy-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099336 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b83ab56c-e28d-4e82-ae8f-92649a1448ed" volumeName="kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 
master-0 kubenswrapper[13046]: I0308 03:13:18.099350 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099365 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17eaab63-9ba9-4a4a-891d-a76aa3f03b46" volumeName="kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099381 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e71caa06-6ce7-47c9-a267-21f6b6af9247" volumeName="kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099396 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6432d23b-a55a-4131-83d5-5f16419809dd" volumeName="kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099410 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="caa3a50c-1291-4152-a48a-f7c7b49627db" volumeName="kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099424 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e71caa06-6ce7-47c9-a267-21f6b6af9247" volumeName="kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 
kubenswrapper[13046]: I0308 03:13:18.099436 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fe33f926-9348-4498-a892-d2becaeecc14" volumeName="kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099449 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bbe9b81-0efb-4caa-bacd-55348cd392c6" volumeName="kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099512 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="52836130-d42e-495c-adbf-19ff9a393347" volumeName="kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099530 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70fba73e-c201-4866-bc69-64892ea5bdca" volumeName="kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099542 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="caa3a50c-1291-4152-a48a-f7c7b49627db" volumeName="kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099555 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d83aa242-606f-4adc-b689-4aa89625b533" volumeName="kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj" seLinuxMountContext="" Mar 08 
03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099567 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfe0357f-dab4-4424-869c-f6070b411a35" volumeName="kubernetes.io/projected/dfe0357f-dab4-4424-869c-f6070b411a35-kube-api-access-w5xsp" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099580 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e74c8bb2-e063-4b60-b3fe-651aa534d029" volumeName="kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099591 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eedc7538-9cc6-4bf5-9628-e278310d796b" volumeName="kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099616 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4108f513-acef-473a-ab03-f3761b2bd0d8" volumeName="kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099630 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" volumeName="kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099641 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fd6b827c-70b0-47ed-b07c-c696343248a8" volumeName="kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca" 
seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099655 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" volumeName="kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099669 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="234638fe-5577-45bc-9094-907c5611da38" volumeName="kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099681 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" volumeName="kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-service-ca-bundle" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099693 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a08ddb-1045-4631-ba52-93f3046ebd0a" volumeName="kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099705 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099722 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e569889-4759-4046-b0ed-e550078521c6" 
volumeName="kubernetes.io/secret/0e569889-4759-4046-b0ed-e550078521c6-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099735 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" volumeName="kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099750 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9d0240-fc00-4d78-9458-8f53b1876f1b" volumeName="kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-utilities" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099763 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c9f377bf-79c5-4425-b5d1-256961835f62" volumeName="kubernetes.io/secret/c9f377bf-79c5-4425-b5d1-256961835f62-signing-key" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099775 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="de90d207-06d6-4778-b1b0-9020b1f2a881" volumeName="kubernetes.io/projected/de90d207-06d6-4778-b1b0-9020b1f2a881-kube-api-access-lh2rs" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099786 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-trusted-ca-bundle" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099798 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="febf6a91-8b78-4b22-93b9-155cb7761fc4" volumeName="kubernetes.io/empty-dir/febf6a91-8b78-4b22-93b9-155cb7761fc4-tmpfs" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099811 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50ab8f71-42b8-4967-8a0b-016647c59a37" volumeName="kubernetes.io/projected/50ab8f71-42b8-4967-8a0b-016647c59a37-kube-api-access-h9jsw" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099824 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68309159-130a-4ffa-acec-95dc4b795b8f" volumeName="kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-catalog-content" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099837 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c0192f3-2e60-42c6-9836-c70a9fa407d5" volumeName="kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099848 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dac2b210-2fbb-4d25-a0ea-1825259cee3b" volumeName="kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-policies" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099867 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e569889-4759-4046-b0ed-e550078521c6" volumeName="kubernetes.io/projected/0e569889-4759-4046-b0ed-e550078521c6-kube-api-access-m97fm" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099880 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="555ae3b4-71c6-4b62-9e09-66a58ae4c6ad" volumeName="kubernetes.io/projected/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad-kube-api-access-rzmjd" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099892 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="982ea338-c7be-4776-9bb7-113834c54aaa" volumeName="kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099906 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fd6b827c-70b0-47ed-b07c-c696343248a8" volumeName="kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099920 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a2c9576-f7bd-4ac5-a7fe-530f26642f97" volumeName="kubernetes.io/secret/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099934 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68309159-130a-4ffa-acec-95dc4b795b8f" volumeName="kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-utilities" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099945 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa781f72-e72f-47e1-b37a-977340c182c8" volumeName="kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099957 13046 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" volumeName="kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099969 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" volumeName="kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.099984 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99d6808-9fec-402d-93f7-41575a5a0a08" volumeName="kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-serving-cert" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100005 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="275be8d3-df30-46f7-9d0a-806e404dfd57" volumeName="kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100022 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dac2b210-2fbb-4d25-a0ea-1825259cee3b" volumeName="kubernetes.io/projected/dac2b210-2fbb-4d25-a0ea-1825259cee3b-kube-api-access-ntd2k" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100036 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33c15b06-a21e-411f-b324-3ae0c7f0e9a4" volumeName="kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100051 13046 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="d358134e-5625-492c-b4f7-460798631270" volumeName="kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100065 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" volumeName="kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100080 13046 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d23557f-6bb1-46ce-a56e-d0011c576125" volumeName="kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp" seLinuxMountContext="" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100092 13046 reconstruct.go:97] "Volume reconstruction finished" Mar 08 03:13:18.102307 master-0 kubenswrapper[13046]: I0308 03:13:18.100103 13046 reconciler.go:26] "Reconciler: start to sync state" Mar 08 03:13:18.115651 master-0 kubenswrapper[13046]: I0308 03:13:18.115577 13046 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 03:13:18.117034 master-0 kubenswrapper[13046]: I0308 03:13:18.117009 13046 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 08 03:13:18.117105 master-0 kubenswrapper[13046]: I0308 03:13:18.117057 13046 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 03:13:18.117105 master-0 kubenswrapper[13046]: I0308 03:13:18.117099 13046 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 03:13:18.117220 master-0 kubenswrapper[13046]: E0308 03:13:18.117156 13046 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 08 03:13:18.119029 master-0 kubenswrapper[13046]: W0308 03:13:18.118958 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:18.119078 master-0 kubenswrapper[13046]: E0308 03:13:18.119044 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:18.125029 master-0 kubenswrapper[13046]: I0308 03:13:18.124975 13046 generic.go:334] "Generic (PLEG): container finished" podID="3178dfc0-a35e-418e-a954-cd919b8af88c" containerID="a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0" exitCode=0 Mar 08 03:13:18.128685 master-0 kubenswrapper[13046]: I0308 03:13:18.128633 13046 generic.go:334] "Generic (PLEG): container finished" podID="dac2b210-2fbb-4d25-a0ea-1825259cee3b" containerID="00745b525b8dc575694c9918d1d6d5efb6d0a2ce9d5450f6f2b8a338bdae4d1c" exitCode=0 Mar 08 03:13:18.144255 master-0 kubenswrapper[13046]: I0308 03:13:18.144211 13046 generic.go:334] "Generic (PLEG): container finished" 
podID="1a6e3f01-0f22-4961-b450-56aca5477943" containerID="efa860cee031eccbf226be40d18bc7a86bd5de050910722eed848d08f9d751a8" exitCode=0 Mar 08 03:13:18.152649 master-0 kubenswrapper[13046]: I0308 03:13:18.152598 13046 generic.go:334] "Generic (PLEG): container finished" podID="6432d23b-a55a-4131-83d5-5f16419809dd" containerID="d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a" exitCode=0 Mar 08 03:13:18.154940 master-0 kubenswrapper[13046]: I0308 03:13:18.154906 13046 generic.go:334] "Generic (PLEG): container finished" podID="b05d5093-20f4-42d5-9db3-811e049cc1b6" containerID="9dcf81635de6906146c147e78ec6bda20f98dd55e53a8e7eb4bd3270e962f41e" exitCode=0 Mar 08 03:13:18.160636 master-0 kubenswrapper[13046]: I0308 03:13:18.160599 13046 generic.go:334] "Generic (PLEG): container finished" podID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerID="6f47524787fe6d12f2f00918cc138535f7c801d780aa325200500bc9264d2c6c" exitCode=0 Mar 08 03:13:18.165075 master-0 kubenswrapper[13046]: I0308 03:13:18.164972 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_e81d3c37-e8d7-44c8-973e-13992380ce85/installer/0.log" Mar 08 03:13:18.165165 master-0 kubenswrapper[13046]: I0308 03:13:18.165127 13046 generic.go:334] "Generic (PLEG): container finished" podID="e81d3c37-e8d7-44c8-973e-13992380ce85" containerID="d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e" exitCode=1 Mar 08 03:13:18.177640 master-0 kubenswrapper[13046]: I0308 03:13:18.177559 13046 generic.go:334] "Generic (PLEG): container finished" podID="982ea338-c7be-4776-9bb7-113834c54aaa" containerID="5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56" exitCode=0 Mar 08 03:13:18.179653 master-0 kubenswrapper[13046]: E0308 03:13:18.179622 13046 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:13:18.180236 master-0 kubenswrapper[13046]: I0308 03:13:18.180196 
13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-qsgq7_70fba73e-c201-4866-bc69-64892ea5bdca/openshift-controller-manager-operator/0.log" Mar 08 03:13:18.180321 master-0 kubenswrapper[13046]: I0308 03:13:18.180250 13046 generic.go:334] "Generic (PLEG): container finished" podID="70fba73e-c201-4866-bc69-64892ea5bdca" containerID="7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e" exitCode=1 Mar 08 03:13:18.188004 master-0 kubenswrapper[13046]: I0308 03:13:18.187934 13046 generic.go:334] "Generic (PLEG): container finished" podID="9cf6ce1a-c203-4033-86be-be16694a9062" containerID="ccf5656bb56a19c7a22e492a44ae1446dc7c5b94a77f84f22b258b7af6805d2a" exitCode=0 Mar 08 03:13:18.189826 master-0 kubenswrapper[13046]: I0308 03:13:18.189793 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 08 03:13:18.190238 master-0 kubenswrapper[13046]: I0308 03:13:18.190129 13046 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a" exitCode=1 Mar 08 03:13:18.190238 master-0 kubenswrapper[13046]: I0308 03:13:18.190227 13046 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="a319c730809ca26c3a85e3da2618b48d9d17632d0a08b9cfde4f3e18505c5755" exitCode=0 Mar 08 03:13:18.196375 master-0 kubenswrapper[13046]: I0308 03:13:18.196333 13046 generic.go:334] "Generic (PLEG): container finished" podID="cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd" containerID="a24cd319d12c3bab1bf2b10e5afaf7c8e507dee6da981a24060116593e6e64aa" exitCode=0 Mar 08 03:13:18.209956 master-0 kubenswrapper[13046]: I0308 03:13:18.209914 13046 generic.go:334] "Generic (PLEG): container finished" 
podID="6a9d0240-fc00-4d78-9458-8f53b1876f1b" containerID="f9107d4fe9e5fd8ff1dc1c072f33a8b790e39886ea3bd32d4664e530799cf713" exitCode=0 Mar 08 03:13:18.215769 master-0 kubenswrapper[13046]: I0308 03:13:18.215732 13046 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="c1d3c31c196416ae00334f18b3e579542658be979ab39e41ffb430f787c5ee3e" exitCode=0 Mar 08 03:13:18.215861 master-0 kubenswrapper[13046]: I0308 03:13:18.215825 13046 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="b3ea93aa98c6a855a072d3642fcdd00f5f7951231e2c2010a477ac7e3afcf009" exitCode=0 Mar 08 03:13:18.215861 master-0 kubenswrapper[13046]: I0308 03:13:18.215850 13046 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="bf04118919c009f59ea3e84f16c295d8440cef4db850135663e4a2db1d87ef48" exitCode=0 Mar 08 03:13:18.215861 master-0 kubenswrapper[13046]: I0308 03:13:18.215859 13046 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="db3a63a925785d6eff81f565afee5497f9a99d04d1c84187c3150ffb13b3defd" exitCode=0 Mar 08 03:13:18.215948 master-0 kubenswrapper[13046]: I0308 03:13:18.215869 13046 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="2adae51f407e15a120ce855a4a69d4bbd243779881704875d67dd256bba0227a" exitCode=0 Mar 08 03:13:18.215948 master-0 kubenswrapper[13046]: I0308 03:13:18.215880 13046 generic.go:334] "Generic (PLEG): container finished" podID="76ceb013-e999-4f15-bf25-f8dcd2647f9f" containerID="4aa5fc291dd0b6e7ec288140975372ce39389e86edf66a268784556c20872aa9" exitCode=0 Mar 08 03:13:18.217662 master-0 kubenswrapper[13046]: E0308 03:13:18.217638 13046 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 08 03:13:18.219984 master-0 
kubenswrapper[13046]: I0308 03:13:18.219920 13046 generic.go:334] "Generic (PLEG): container finished" podID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerID="2ecfec74d59cc3d0b000968048ba4bb7931b60227ed12aaa53445141ec092ff9" exitCode=0
Mar 08 03:13:18.226625 master-0 kubenswrapper[13046]: I0308 03:13:18.226591 13046 generic.go:334] "Generic (PLEG): container finished" podID="7e324f6c-ee4c-42bc-b241-9c6938749854" containerID="f23bd786497d6c307edb85e8d774c9b8f2223af0ca9dc43c45c0639c00c00251" exitCode=0
Mar 08 03:13:18.404758 master-0 kubenswrapper[13046]: E0308 03:13:18.404594 13046 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 03:13:18.407348 master-0 kubenswrapper[13046]: E0308 03:13:18.407269 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 08 03:13:18.418042 master-0 kubenswrapper[13046]: E0308 03:13:18.418011 13046 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 08 03:13:18.423114 master-0 kubenswrapper[13046]: I0308 03:13:18.423062 13046 generic.go:334] "Generic (PLEG): container finished" podID="68309159-130a-4ffa-acec-95dc4b795b8f" containerID="c9f7a2553aef09408038ebc72fb8e56d18eb9a842f8d18ad116a3d6714abc2f9" exitCode=0
Mar 08 03:13:18.430579 master-0 kubenswrapper[13046]: I0308 03:13:18.430455 13046 generic.go:334] "Generic (PLEG): container finished" podID="50ab8f71-42b8-4967-8a0b-016647c59a37" containerID="e1d9e093cba9edf2b9fe5ff93e3ebb84d76e14c7ae92e011cb61c2ecdf53de26" exitCode=0
Mar 08 03:13:18.440049 master-0 kubenswrapper[13046]: I0308 03:13:18.438991 13046 generic.go:334] "Generic (PLEG): container finished" podID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerID="00d06648e335af10bd876c293f9902417ade2722b0f152f68b636aa5a6ef0592" exitCode=0
Mar 08 03:13:18.442530 master-0 kubenswrapper[13046]: I0308 03:13:18.442451 13046 generic.go:334] "Generic (PLEG): container finished" podID="f99d6808-9fec-402d-93f7-41575a5a0a08" containerID="ff016ecc0b1406f7273a88aa6d6e16959d56705496418cbd4648458844f2bbb1" exitCode=0
Mar 08 03:13:18.448162 master-0 kubenswrapper[13046]: I0308 03:13:18.448050 13046 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" containerID="37ec7f6b3aeafa0c1aa240a3f289ec19e14a9c93e8dc0c62d0b70aca6f9a3fcf" exitCode=0
Mar 08 03:13:18.448162 master-0 kubenswrapper[13046]: I0308 03:13:18.448158 13046 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" containerID="c01db717080df3353657720c6a107d2d2fc41a3dd4edfcbd6e02a08696eb5639" exitCode=0
Mar 08 03:13:18.448274 master-0 kubenswrapper[13046]: I0308 03:13:18.448167 13046 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" containerID="d7c267c7d1ad40b10c4f9d19008802c751a1cdd3364f0744ee013a61bcad5ca6" exitCode=0
Mar 08 03:13:18.454837 master-0 kubenswrapper[13046]: I0308 03:13:18.454800 13046 generic.go:334] "Generic (PLEG): container finished" podID="7ea81472-8a81-45ec-a07d-8710f47a927d" containerID="d57e1157c7569d934ea76665ae63811243fb6a6eb902e18c216d3947853ca6e4" exitCode=0
Mar 08 03:13:18.466043 master-0 kubenswrapper[13046]: I0308 03:13:18.465841 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="4408f61b9048ed833e9161e86cec42c8c15221795d207fe82e8f7a4527778dfb" exitCode=0
Mar 08 03:13:18.466244 master-0 kubenswrapper[13046]: I0308 03:13:18.466098 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="fce5e0a13a8a0d48e21a3b4ab57b6f9f5f4d96f3d5aa8a45af37601d35ca1619" exitCode=0
Mar 08 03:13:18.481003 master-0 kubenswrapper[13046]: I0308 03:13:18.480929 13046 generic.go:334] "Generic (PLEG): container finished" podID="aadbbe97-2a03-40da-846d-252e29661f67" containerID="dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62" exitCode=0
Mar 08 03:13:18.485258 master-0 kubenswrapper[13046]: I0308 03:13:18.485212 13046 generic.go:334] "Generic (PLEG): container finished" podID="c0a08ddb-1045-4631-ba52-93f3046ebd0a" containerID="1a20bdbedb5b13853225f367842b80deec1d4120a3bc963794fd1350f7fbce22" exitCode=0
Mar 08 03:13:18.498313 master-0 kubenswrapper[13046]: I0308 03:13:18.497875 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42" exitCode=1
Mar 08 03:13:18.504993 master-0 kubenswrapper[13046]: I0308 03:13:18.504643 13046 generic.go:334] "Generic (PLEG): container finished" podID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" containerID="d901422733644b9a69bd0914635930a2d55c9786ff5a015eee041ee28b2a4386" exitCode=0
Mar 08 03:13:18.504993 master-0 kubenswrapper[13046]: E0308 03:13:18.504752 13046 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 03:13:18.506533 master-0 kubenswrapper[13046]: I0308 03:13:18.506502 13046 generic.go:334] "Generic (PLEG): container finished" podID="8c0192f3-2e60-42c6-9836-c70a9fa407d5" containerID="2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9" exitCode=0
Mar 08 03:13:18.510226 master-0 kubenswrapper[13046]: I0308 03:13:18.510194 13046 generic.go:334] "Generic (PLEG): container finished" podID="e71caa06-6ce7-47c9-a267-21f6b6af9247" containerID="5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2" exitCode=0
Mar 08 03:13:18.512542 master-0 kubenswrapper[13046]: I0308 03:13:18.512347 13046 generic.go:334] "Generic (PLEG): container finished" podID="b83ab56c-e28d-4e82-ae8f-92649a1448ed" containerID="ab858aba9fe747164d134176fff1d99d6f77b5114eeaf6f38c2480128cb7485f" exitCode=0
Mar 08 03:13:18.605018 master-0 kubenswrapper[13046]: E0308 03:13:18.604983 13046 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 08 03:13:18.611975 master-0 kubenswrapper[13046]: I0308 03:13:18.611943 13046 manager.go:324] Recovery completed
Mar 08 03:13:18.678628 master-0 kubenswrapper[13046]: I0308 03:13:18.678601 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.681241 master-0 kubenswrapper[13046]: I0308 03:13:18.681175 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.681325 master-0 kubenswrapper[13046]: I0308 03:13:18.681259 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.681325 master-0 kubenswrapper[13046]: I0308 03:13:18.681275 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.684165 master-0 kubenswrapper[13046]: I0308 03:13:18.684148 13046 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 08 03:13:18.684252 master-0 kubenswrapper[13046]: I0308 03:13:18.684239 13046 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 08 03:13:18.684315 master-0 kubenswrapper[13046]: I0308 03:13:18.684305 13046 state_mem.go:36] "Initialized new in-memory state store"
Mar 08 03:13:18.684632 master-0 kubenswrapper[13046]: I0308 03:13:18.684614 13046 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 08 03:13:18.684713 master-0 kubenswrapper[13046]: I0308 03:13:18.684690 13046 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 08 03:13:18.684770 master-0 kubenswrapper[13046]: I0308 03:13:18.684761 13046 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 08 03:13:18.684821 master-0 kubenswrapper[13046]: I0308 03:13:18.684812 13046 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 08 03:13:18.684867 master-0 kubenswrapper[13046]: I0308 03:13:18.684859 13046 policy_none.go:49] "None policy: Start"
Mar 08 03:13:18.687261 master-0 kubenswrapper[13046]: I0308 03:13:18.687229 13046 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 08 03:13:18.687311 master-0 kubenswrapper[13046]: I0308 03:13:18.687268 13046 state_mem.go:35] "Initializing new in-memory state store"
Mar 08 03:13:18.687492 master-0 kubenswrapper[13046]: I0308 03:13:18.687459 13046 state_mem.go:75] "Updated machine memory state"
Mar 08 03:13:18.687492 master-0 kubenswrapper[13046]: I0308 03:13:18.687474 13046 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 08 03:13:18.704251 master-0 kubenswrapper[13046]: I0308 03:13:18.704224 13046 manager.go:334] "Starting Device Plugin manager"
Mar 08 03:13:18.704338 master-0 kubenswrapper[13046]: I0308 03:13:18.704264 13046 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 08 03:13:18.704338 master-0 kubenswrapper[13046]: I0308 03:13:18.704275 13046 server.go:79] "Starting device plugin registration server"
Mar 08 03:13:18.704579 master-0 kubenswrapper[13046]: I0308 03:13:18.704565 13046 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 08 03:13:18.704626 master-0 kubenswrapper[13046]: I0308 03:13:18.704580 13046 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 08 03:13:18.706028 master-0 kubenswrapper[13046]: I0308 03:13:18.706002 13046 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 08 03:13:18.706127 master-0 kubenswrapper[13046]: I0308 03:13:18.706110 13046 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 08 03:13:18.706127 master-0 kubenswrapper[13046]: I0308 03:13:18.706122 13046 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 08 03:13:18.718049 master-0 kubenswrapper[13046]: E0308 03:13:18.718014 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:13:18.804804 master-0 kubenswrapper[13046]: I0308 03:13:18.804695 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.806948 master-0 kubenswrapper[13046]: I0308 03:13:18.806923 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.806948 master-0 kubenswrapper[13046]: I0308 03:13:18.806949 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.807035 master-0 kubenswrapper[13046]: I0308 03:13:18.806957 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.807035 master-0 kubenswrapper[13046]: I0308 03:13:18.806974 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:13:18.807713 master-0 kubenswrapper[13046]: E0308 03:13:18.807682 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:13:18.807887 master-0 kubenswrapper[13046]: E0308 03:13:18.807839 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 08 03:13:18.818567 master-0 kubenswrapper[13046]: I0308 03:13:18.818524 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:13:18.818656 master-0 kubenswrapper[13046]: I0308 03:13:18.818600 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.820509 master-0 kubenswrapper[13046]: I0308 03:13:18.820476 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.820569 master-0 kubenswrapper[13046]: I0308 03:13:18.820522 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.820569 master-0 kubenswrapper[13046]: I0308 03:13:18.820532 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.820648 master-0 kubenswrapper[13046]: I0308 03:13:18.820631 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.820864 master-0 kubenswrapper[13046]: I0308 03:13:18.820837 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.823097 master-0 kubenswrapper[13046]: I0308 03:13:18.823068 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.823146 master-0 kubenswrapper[13046]: I0308 03:13:18.823106 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.823146 master-0 kubenswrapper[13046]: I0308 03:13:18.823124 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.823235 master-0 kubenswrapper[13046]: I0308 03:13:18.823213 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.823414 master-0 kubenswrapper[13046]: I0308 03:13:18.823390 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.823414 master-0 kubenswrapper[13046]: I0308 03:13:18.823413 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.823514 master-0 kubenswrapper[13046]: I0308 03:13:18.823422 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.823856 master-0 kubenswrapper[13046]: I0308 03:13:18.823831 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.824786 master-0 kubenswrapper[13046]: I0308 03:13:18.824765 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.824786 master-0 kubenswrapper[13046]: I0308 03:13:18.824786 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.824864 master-0 kubenswrapper[13046]: I0308 03:13:18.824794 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.824864 master-0 kubenswrapper[13046]: I0308 03:13:18.824850 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.825132 master-0 kubenswrapper[13046]: I0308 03:13:18.825115 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.825767 master-0 kubenswrapper[13046]: I0308 03:13:18.825747 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.825809 master-0 kubenswrapper[13046]: I0308 03:13:18.825771 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.825809 master-0 kubenswrapper[13046]: I0308 03:13:18.825794 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.831809 master-0 kubenswrapper[13046]: I0308 03:13:18.831775 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.831871 master-0 kubenswrapper[13046]: I0308 03:13:18.831843 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.831871 master-0 kubenswrapper[13046]: I0308 03:13:18.831866 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.833042 master-0 kubenswrapper[13046]: I0308 03:13:18.833009 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.833042 master-0 kubenswrapper[13046]: I0308 03:13:18.833040 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.833120 master-0 kubenswrapper[13046]: I0308 03:13:18.833050 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.833193 master-0 kubenswrapper[13046]: I0308 03:13:18.833176 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.834276 master-0 kubenswrapper[13046]: I0308 03:13:18.834233 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.836167 master-0 kubenswrapper[13046]: I0308 03:13:18.836129 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.836327 master-0 kubenswrapper[13046]: I0308 03:13:18.836296 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.836378 master-0 kubenswrapper[13046]: I0308 03:13:18.836332 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.836589 master-0 kubenswrapper[13046]: I0308 03:13:18.836559 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.836742 master-0 kubenswrapper[13046]: I0308 03:13:18.836710 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:13:18.836790 master-0 kubenswrapper[13046]: I0308 03:13:18.836769 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.841447 master-0 kubenswrapper[13046]: I0308 03:13:18.841410 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.841447 master-0 kubenswrapper[13046]: I0308 03:13:18.841447 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.841586 master-0 kubenswrapper[13046]: I0308 03:13:18.841456 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.841586 master-0 kubenswrapper[13046]: I0308 03:13:18.841541 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.841586 master-0 kubenswrapper[13046]: I0308 03:13:18.841561 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.841677 master-0 kubenswrapper[13046]: I0308 03:13:18.841595 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.841677 master-0 kubenswrapper[13046]: I0308 03:13:18.841612 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.841785 master-0 kubenswrapper[13046]: I0308 03:13:18.841612 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.841824 master-0 kubenswrapper[13046]: I0308 03:13:18.841785 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.841910 master-0 kubenswrapper[13046]: I0308 03:13:18.841882 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c5d0055fe86ac5ea9ab079fff2e0abe8e3b575553100200612d0622ec310e85"
Mar 08 03:13:18.841945 master-0 kubenswrapper[13046]: I0308 03:13:18.841921 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08cfce465e0a061ec50a36c70cc4aab986b0fff0117392402f5475228663b20a"
Mar 08 03:13:18.841976 master-0 kubenswrapper[13046]: I0308 03:13:18.841942 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:18.841976 master-0 kubenswrapper[13046]: I0308 03:13:18.841965 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:18.842070 master-0 kubenswrapper[13046]: I0308 03:13:18.841992 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"6f51a4db85d18d82603b426a557c9c6da1c85541f85af4f912c744b7f3a66c18"}
Mar 08 03:13:18.842101 master-0 kubenswrapper[13046]: I0308 03:13:18.842077 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"dc9e74046c3486145e3e154ad4e0717958ffe525262b3d699c7541c237f7c61a"}
Mar 08 03:13:18.842135 master-0 kubenswrapper[13046]: I0308 03:13:18.842101 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"a319c730809ca26c3a85e3da2618b48d9d17632d0a08b9cfde4f3e18505c5755"}
Mar 08 03:13:18.842135 master-0 kubenswrapper[13046]: I0308 03:13:18.842127 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"10a7036769949c8fb65f4137dcc72de5165f2b95b9d6ff0a9f3ccce5f65dada8"}
Mar 08 03:13:18.842192 master-0 kubenswrapper[13046]: I0308 03:13:18.842179 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1"}
Mar 08 03:13:18.842221 master-0 kubenswrapper[13046]: I0308 03:13:18.842198 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044"}
Mar 08 03:13:18.842252 master-0 kubenswrapper[13046]: I0308 03:13:18.842217 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"d55c2d192f202b920649085c7531e76c4e51cd3dc541693c6c74ec752e4a4f74"}
Mar 08 03:13:18.842346 master-0 kubenswrapper[13046]: I0308 03:13:18.842321 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f96c717487c8fec762c20667ad24a9394c9f9417d53c2f93b4dfcdc28227e714"
Mar 08 03:13:18.842411 master-0 kubenswrapper[13046]: I0308 03:13:18.842391 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1794eb09196873bf427659d432b54b129f9a36a9e463dc4369182d53770da8f"
Mar 08 03:13:18.842450 master-0 kubenswrapper[13046]: I0308 03:13:18.842421 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"0c2763066b9b93da23f0fac4ed741acd53596f68416bb8dfcb0cbbdd5cec3459"}
Mar 08 03:13:18.842450 master-0 kubenswrapper[13046]: I0308 03:13:18.842440 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778"}
Mar 08 03:13:18.842585 master-0 kubenswrapper[13046]: I0308 03:13:18.842559 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"9e6b8d3a0f03e9035732338fcc893d4e73b26cab45767d3a7fcf55c614fe104a"}
Mar 08 03:13:18.842585 master-0 kubenswrapper[13046]: I0308 03:13:18.842578 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"5cf51539374bcfe72a242f1e53596d9c98c86b64c9179b7354efb8ce2765e3ca"}
Mar 08 03:13:18.842636 master-0 kubenswrapper[13046]: I0308 03:13:18.842597 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42"}
Mar 08 03:13:18.842636 master-0 kubenswrapper[13046]: I0308 03:13:18.842617 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0"}
Mar 08 03:13:18.843845 master-0 kubenswrapper[13046]: I0308 03:13:18.843808 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:18.843895 master-0 kubenswrapper[13046]: I0308 03:13:18.843850 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:18.843895 master-0 kubenswrapper[13046]: I0308 03:13:18.843862 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:18.917450 master-0 kubenswrapper[13046]: I0308 03:13:18.917397 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:13:18.917450 master-0 kubenswrapper[13046]: I0308 03:13:18.917444 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:18.917668 master-0 kubenswrapper[13046]: I0308 03:13:18.917506 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:18.917668 master-0 kubenswrapper[13046]: I0308 03:13:18.917535 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:18.917668 master-0 kubenswrapper[13046]: I0308 03:13:18.917554 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:13:18.917668 master-0 kubenswrapper[13046]: I0308 03:13:18.917568 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:13:18.917668 master-0 kubenswrapper[13046]: I0308 03:13:18.917590 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:18.917668 master-0 kubenswrapper[13046]: I0308 03:13:18.917608 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:18.917668 master-0 kubenswrapper[13046]: I0308 03:13:18.917652 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:18.917853 master-0 kubenswrapper[13046]: I0308 03:13:18.917684 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:13:18.917853 master-0 kubenswrapper[13046]: I0308 03:13:18.917705 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:18.917853 master-0 kubenswrapper[13046]: I0308 03:13:18.917732 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:18.917853 master-0 kubenswrapper[13046]: I0308 03:13:18.917753 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:13:18.917853 master-0 kubenswrapper[13046]: I0308 03:13:18.917773 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:13:18.917980 master-0 kubenswrapper[13046]: I0308 03:13:18.917838 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:13:18.917980 master-0 kubenswrapper[13046]: I0308 03:13:18.917902 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:13:18.917980 master-0 kubenswrapper[13046]: I0308 03:13:18.917925 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:18.917980 master-0 kubenswrapper[13046]: I0308 03:13:18.917941 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:18.918086 master-0 kubenswrapper[13046]: I0308 03:13:18.918011 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 08 03:13:19.008426 master-0 kubenswrapper[13046]: I0308 03:13:19.008353 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:19.011893 master-0 kubenswrapper[13046]: I0308 03:13:19.011853 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:19.011954 master-0 kubenswrapper[13046]: I0308 03:13:19.011898 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:19.011954 master-0 kubenswrapper[13046]: I0308 03:13:19.011911 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:19.011954 master-0 kubenswrapper[13046]: I0308 03:13:19.011935 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:13:19.013081 master-0 kubenswrapper[13046]: E0308 03:13:19.012995 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:13:19.019114 master-0 kubenswrapper[13046]: I0308 03:13:19.019073 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:13:19.019114 master-0 kubenswrapper[13046]: I0308 03:13:19.019111 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:13:19.019210 master-0 kubenswrapper[13046]: I0308 03:13:19.019138 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:19.019210 master-0 kubenswrapper[13046]: I0308 03:13:19.019162 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:19.019300 master-0 kubenswrapper[13046]: I0308 03:13:19.019215 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:19.019300 master-0 kubenswrapper[13046]: I0308 03:13:19.019219 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:19.019300 master-0 kubenswrapper[13046]: I0308 03:13:19.019279 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:19.019300 master-0 kubenswrapper[13046]: I0308 03:13:19.019290 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 08 03:13:19.019300 master-0 kubenswrapper[13046]: I0308 03:13:19.019302 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]: I0308 03:13:19.019280 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]: I0308 03:13:19.019332 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]: I0308 03:13:19.019345 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]:
I0308 03:13:19.019372 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]: I0308 03:13:19.019393 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]: I0308 03:13:19.019344 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]: I0308 03:13:19.019448 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:19.019495 master-0 kubenswrapper[13046]: I0308 03:13:19.019473 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 
03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019513 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019534 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019539 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019574 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019597 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019619 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019640 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019662 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019722 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019731 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" 
(UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019742 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019762 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:19.019782 master-0 kubenswrapper[13046]: I0308 03:13:19.019790 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019778 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019827 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: 
\"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019837 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019755 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019794 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019880 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019918 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod 
\"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.020149 master-0 kubenswrapper[13046]: I0308 03:13:19.019943 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:19.062281 master-0 kubenswrapper[13046]: I0308 03:13:19.062150 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:19.101878 master-0 kubenswrapper[13046]: W0308 03:13:19.101776 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:19.102025 master-0 kubenswrapper[13046]: E0308 03:13:19.101884 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:19.142633 master-0 kubenswrapper[13046]: I0308 03:13:19.142554 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:13:19.144859 master-0 kubenswrapper[13046]: I0308 03:13:19.144809 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:13:19.177653 master-0 kubenswrapper[13046]: I0308 03:13:19.177449 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.189159 master-0 kubenswrapper[13046]: W0308 03:13:19.189104 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:19.189246 master-0 kubenswrapper[13046]: E0308 03:13:19.189204 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:19.413824 master-0 kubenswrapper[13046]: I0308 03:13:19.413710 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:19.418219 master-0 kubenswrapper[13046]: I0308 03:13:19.417818 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:19.418219 master-0 kubenswrapper[13046]: I0308 03:13:19.417871 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:19.418219 master-0 kubenswrapper[13046]: I0308 03:13:19.417885 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientPID" Mar 08 03:13:19.418219 master-0 kubenswrapper[13046]: I0308 03:13:19.417913 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:13:19.418748 master-0 kubenswrapper[13046]: E0308 03:13:19.418661 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:13:19.518006 master-0 kubenswrapper[13046]: I0308 03:13:19.517937 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:19.518006 master-0 kubenswrapper[13046]: I0308 03:13:19.517978 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:19.518318 master-0 kubenswrapper[13046]: I0308 03:13:19.518048 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:19.519477 master-0 kubenswrapper[13046]: I0308 03:13:19.517950 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:19.522653 master-0 kubenswrapper[13046]: I0308 03:13:19.522610 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:19.522725 master-0 kubenswrapper[13046]: I0308 03:13:19.522655 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:19.522725 master-0 kubenswrapper[13046]: I0308 03:13:19.522676 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:19.522725 master-0 kubenswrapper[13046]: I0308 03:13:19.522687 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:19.522725 master-0 
kubenswrapper[13046]: I0308 03:13:19.522712 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:19.522853 master-0 kubenswrapper[13046]: I0308 03:13:19.522743 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:19.522853 master-0 kubenswrapper[13046]: I0308 03:13:19.522754 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:19.522853 master-0 kubenswrapper[13046]: I0308 03:13:19.522650 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:19.522951 master-0 kubenswrapper[13046]: I0308 03:13:19.522878 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:19.525088 master-0 kubenswrapper[13046]: W0308 03:13:19.525028 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:19.525213 master-0 kubenswrapper[13046]: E0308 03:13:19.525105 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:19.530926 master-0 kubenswrapper[13046]: I0308 03:13:19.530828 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:19.531152 master-0 kubenswrapper[13046]: I0308 03:13:19.530957 13046 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:19.531152 master-0 kubenswrapper[13046]: I0308 03:13:19.530975 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:19.608822 master-0 kubenswrapper[13046]: E0308 03:13:19.608759 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 08 03:13:19.629781 master-0 kubenswrapper[13046]: I0308 03:13:19.629610 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:19.701662 master-0 kubenswrapper[13046]: W0308 03:13:19.701507 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:19.701662 master-0 kubenswrapper[13046]: E0308 03:13:19.701606 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:20.062038 master-0 kubenswrapper[13046]: I0308 03:13:20.061900 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:20.219624 master-0 
kubenswrapper[13046]: I0308 03:13:20.219576 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:20.222043 master-0 kubenswrapper[13046]: I0308 03:13:20.221972 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:20.222095 master-0 kubenswrapper[13046]: I0308 03:13:20.222052 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:20.222095 master-0 kubenswrapper[13046]: I0308 03:13:20.222064 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:20.222095 master-0 kubenswrapper[13046]: I0308 03:13:20.222088 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:13:20.223146 master-0 kubenswrapper[13046]: E0308 03:13:20.223095 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:13:20.490075 master-0 kubenswrapper[13046]: I0308 03:13:20.490038 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:20.525244 master-0 kubenswrapper[13046]: I0308 03:13:20.525193 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="66dcb2ef9f56c8175e9938f33a7650abc0b5ef0e638ee33a15fd5eee5cc90aba" exitCode=0 Mar 08 03:13:20.525392 master-0 kubenswrapper[13046]: I0308 03:13:20.525371 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:20.528028 master-0 kubenswrapper[13046]: I0308 03:13:20.527987 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Mar 08 03:13:20.528092 master-0 kubenswrapper[13046]: I0308 03:13:20.528050 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:20.528092 master-0 kubenswrapper[13046]: I0308 03:13:20.528068 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:21.062632 master-0 kubenswrapper[13046]: I0308 03:13:21.062474 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:21.106601 master-0 kubenswrapper[13046]: W0308 03:13:21.106277 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:21.106601 master-0 kubenswrapper[13046]: E0308 03:13:21.106355 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:21.210537 master-0 kubenswrapper[13046]: E0308 03:13:21.210429 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 08 03:13:21.414906 master-0 kubenswrapper[13046]: W0308 03:13:21.414792 13046 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:21.414906 master-0 kubenswrapper[13046]: E0308 03:13:21.414896 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:21.530747 master-0 kubenswrapper[13046]: I0308 03:13:21.530694 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:21.533664 master-0 kubenswrapper[13046]: I0308 03:13:21.533617 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:21.533664 master-0 kubenswrapper[13046]: I0308 03:13:21.533662 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:21.533821 master-0 kubenswrapper[13046]: I0308 03:13:21.533676 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:21.823536 master-0 kubenswrapper[13046]: I0308 03:13:21.823404 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:21.825658 master-0 kubenswrapper[13046]: I0308 03:13:21.825615 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:21.825658 master-0 kubenswrapper[13046]: I0308 03:13:21.825655 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:21.825658 
master-0 kubenswrapper[13046]: I0308 03:13:21.825666 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:21.825995 master-0 kubenswrapper[13046]: I0308 03:13:21.825687 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:13:21.826428 master-0 kubenswrapper[13046]: E0308 03:13:21.826362 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:13:22.062196 master-0 kubenswrapper[13046]: I0308 03:13:22.062134 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:22.120177 master-0 kubenswrapper[13046]: W0308 03:13:22.120053 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:22.120177 master-0 kubenswrapper[13046]: E0308 03:13:22.120124 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:22.554665 master-0 kubenswrapper[13046]: W0308 03:13:22.554432 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:22.554665 master-0 kubenswrapper[13046]: E0308 03:13:22.554536 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:23.061874 master-0 kubenswrapper[13046]: I0308 03:13:23.061826 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:24.061963 master-0 kubenswrapper[13046]: I0308 03:13:24.061898 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:24.412661 master-0 kubenswrapper[13046]: E0308 03:13:24.412523 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 08 03:13:24.547141 master-0 kubenswrapper[13046]: I0308 03:13:24.547053 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_2dc664e3-7f37-4fba-8104-544ffb18c1bd/installer/0.log"
Mar 08 03:13:24.547391 master-0 kubenswrapper[13046]: I0308 03:13:24.547141 13046 generic.go:334] "Generic (PLEG): container finished" podID="2dc664e3-7f37-4fba-8104-544ffb18c1bd" containerID="68c94d100f2836b6f0dea34646419e405565e371a80c5355bfca798f46638f44" exitCode=1
Mar 08 03:13:24.549975 master-0 kubenswrapper[13046]: I0308 03:13:24.549941 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_acb74744-fb99-4663-a7d0-7bae2db205e9/installer/0.log"
Mar 08 03:13:24.550114 master-0 kubenswrapper[13046]: I0308 03:13:24.550012 13046 generic.go:334] "Generic (PLEG): container finished" podID="acb74744-fb99-4663-a7d0-7bae2db205e9" containerID="cc85031403c41701fdfa514e870d79fe56e4ed3f33238513795cdc2323e4fac2" exitCode=1
Mar 08 03:13:24.552644 master-0 kubenswrapper[13046]: I0308 03:13:24.552599 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_e81d3c37-e8d7-44c8-973e-13992380ce85/installer/0.log"
Mar 08 03:13:24.552774 master-0 kubenswrapper[13046]: I0308 03:13:24.552744 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ed2025e9779bcea66b573fc8874abd53d39f18eed8cc8f778680f84e7d7e30"
Mar 08 03:13:24.554714 master-0 kubenswrapper[13046]: I0308 03:13:24.554682 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b767f72fbe851c0148683712fba4f0872103808c8eb0533886fa5261badacc5"
Mar 08 03:13:24.918675 master-0 kubenswrapper[13046]: W0308 03:13:24.915982 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:24.918675 master-0 kubenswrapper[13046]: E0308 03:13:24.916109 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:25.027524 master-0 kubenswrapper[13046]: I0308 03:13:25.027417 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:25.030071 master-0 kubenswrapper[13046]: I0308 03:13:25.030016 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:25.030071 master-0 kubenswrapper[13046]: I0308 03:13:25.030050 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:25.030071 master-0 kubenswrapper[13046]: I0308 03:13:25.030058 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:25.030071 master-0 kubenswrapper[13046]: I0308 03:13:25.030074 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:13:25.031051 master-0 kubenswrapper[13046]: E0308 03:13:25.030943 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:13:25.048169 master-0 kubenswrapper[13046]: W0308 03:13:25.048052 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:25.048169 master-0 kubenswrapper[13046]: E0308 03:13:25.048134 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:25.061984 master-0 kubenswrapper[13046]: I0308 03:13:25.061866 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:25.191733 master-0 kubenswrapper[13046]: W0308 03:13:25.191673 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1092f2a6_865c_4706_bba7_068621e85ebc.slice/crio-54132f811db6c6fd3510c48006dd74aa0959e0085c8d234c70a564b8301df7d9 WatchSource:0}: Error finding container 54132f811db6c6fd3510c48006dd74aa0959e0085c8d234c70a564b8301df7d9: Status 404 returned error can't find the container with id 54132f811db6c6fd3510c48006dd74aa0959e0085c8d234c70a564b8301df7d9
Mar 08 03:13:25.375306 master-0 kubenswrapper[13046]: W0308 03:13:25.370541 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:25.375306 master-0 kubenswrapper[13046]: E0308 03:13:25.370681 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:25.560034 master-0 kubenswrapper[13046]: I0308 03:13:25.559983 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="9e6b8d3a0f03e9035732338fcc893d4e73b26cab45767d3a7fcf55c614fe104a" exitCode=1
Mar 08 03:13:25.560170 master-0 kubenswrapper[13046]: I0308 03:13:25.560037 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"9e6b8d3a0f03e9035732338fcc893d4e73b26cab45767d3a7fcf55c614fe104a"}
Mar 08 03:13:25.560170 master-0 kubenswrapper[13046]: I0308 03:13:25.560079 13046 scope.go:117] "RemoveContainer" containerID="a1a35e1982598cdecae2fbe81adb4750009ffd2dbc74a0fa47202142f17fff42"
Mar 08 03:13:25.560302 master-0 kubenswrapper[13046]: I0308 03:13:25.560195 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:25.563341 master-0 kubenswrapper[13046]: I0308 03:13:25.563280 13046 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="0c2763066b9b93da23f0fac4ed741acd53596f68416bb8dfcb0cbbdd5cec3459" exitCode=1
Mar 08 03:13:25.563341 master-0 kubenswrapper[13046]: I0308 03:13:25.563334 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"0c2763066b9b93da23f0fac4ed741acd53596f68416bb8dfcb0cbbdd5cec3459"}
Mar 08 03:13:25.563558 master-0 kubenswrapper[13046]: I0308 03:13:25.563459 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:25.564078 master-0 kubenswrapper[13046]: I0308 03:13:25.564035 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:25.564078 master-0 kubenswrapper[13046]: I0308 03:13:25.564065 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:25.564078 master-0 kubenswrapper[13046]: I0308 03:13:25.564073 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:25.564426 master-0 kubenswrapper[13046]: I0308 03:13:25.564393 13046 scope.go:117] "RemoveContainer" containerID="9e6b8d3a0f03e9035732338fcc893d4e73b26cab45767d3a7fcf55c614fe104a"
Mar 08 03:13:25.565689 master-0 kubenswrapper[13046]: I0308 03:13:25.565641 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:25.565689 master-0 kubenswrapper[13046]: I0308 03:13:25.565678 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:25.565689 master-0 kubenswrapper[13046]: I0308 03:13:25.565688 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:25.566058 master-0 kubenswrapper[13046]: I0308 03:13:25.566029 13046 scope.go:117] "RemoveContainer" containerID="0c2763066b9b93da23f0fac4ed741acd53596f68416bb8dfcb0cbbdd5cec3459"
Mar 08 03:13:25.833583 master-0 kubenswrapper[13046]: I0308 03:13:25.833534 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:25.838960 master-0 kubenswrapper[13046]: I0308 03:13:25.838895 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:26.062731 master-0 kubenswrapper[13046]: I0308 03:13:26.062651 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:26.576678 master-0 kubenswrapper[13046]: I0308 03:13:26.573474 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"2548f09197f50d8cb959dd110f3a56a4fb32fbf469012609ca083cfd96f66597"}
Mar 08 03:13:26.576678 master-0 kubenswrapper[13046]: I0308 03:13:26.573695 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:26.577608 master-0 kubenswrapper[13046]: I0308 03:13:26.577551 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:26.577608 master-0 kubenswrapper[13046]: I0308 03:13:26.577610 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:26.577608 master-0 kubenswrapper[13046]: I0308 03:13:26.577620 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:26.581611 master-0 kubenswrapper[13046]: I0308 03:13:26.580144 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765"}
Mar 08 03:13:26.581611 master-0 kubenswrapper[13046]: I0308 03:13:26.580302 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:26.591549 master-0 kubenswrapper[13046]: I0308 03:13:26.590964 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:26.591549 master-0 kubenswrapper[13046]: I0308 03:13:26.590992 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:26.591549 master-0 kubenswrapper[13046]: I0308 03:13:26.591001 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:26.604533 master-0 kubenswrapper[13046]: I0308 03:13:26.596359 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:26.914495 master-0 kubenswrapper[13046]: E0308 03:13:26.914309 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:13:27.062111 master-0 kubenswrapper[13046]: I0308 03:13:27.062037 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:27.305244 master-0 kubenswrapper[13046]: W0308 03:13:27.305103 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:27.305244 master-0 kubenswrapper[13046]: E0308 03:13:27.305230 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:27.592596 master-0 kubenswrapper[13046]: I0308 03:13:27.592479 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765" exitCode=1
Mar 08 03:13:27.592838 master-0 kubenswrapper[13046]: I0308 03:13:27.592586 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765"}
Mar 08 03:13:27.592838 master-0 kubenswrapper[13046]: I0308 03:13:27.592656 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:27.592838 master-0 kubenswrapper[13046]: I0308 03:13:27.592715 13046 scope.go:117] "RemoveContainer" containerID="9e6b8d3a0f03e9035732338fcc893d4e73b26cab45767d3a7fcf55c614fe104a"
Mar 08 03:13:27.596038 master-0 kubenswrapper[13046]: I0308 03:13:27.595973 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:27.596038 master-0 kubenswrapper[13046]: I0308 03:13:27.596036 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:27.596408 master-0 kubenswrapper[13046]: I0308 03:13:27.596061 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:27.596886 master-0 kubenswrapper[13046]: I0308 03:13:27.596810 13046 scope.go:117] "RemoveContainer" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765"
Mar 08 03:13:27.597815 master-0 kubenswrapper[13046]: E0308 03:13:27.597272 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:13:28.061938 master-0 kubenswrapper[13046]: I0308 03:13:28.061779 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:28.599027 master-0 kubenswrapper[13046]: I0308 03:13:28.598961 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:28.601855 master-0 kubenswrapper[13046]: I0308 03:13:28.601778 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:28.601855 master-0 kubenswrapper[13046]: I0308 03:13:28.601859 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:28.602102 master-0 kubenswrapper[13046]: I0308 03:13:28.601880 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:28.602562 master-0 kubenswrapper[13046]: I0308 03:13:28.602474 13046 scope.go:117] "RemoveContainer" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765"
Mar 08 03:13:28.602930 master-0 kubenswrapper[13046]: E0308 03:13:28.602872 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:13:28.718180 master-0 kubenswrapper[13046]: E0308 03:13:28.718130 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:13:29.062599 master-0 kubenswrapper[13046]: I0308 03:13:29.062387 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:29.178261 master-0 kubenswrapper[13046]: I0308 03:13:29.178167 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:29.178261 master-0 kubenswrapper[13046]: I0308 03:13:29.178265 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:29.606344 master-0 kubenswrapper[13046]: I0308 03:13:29.606230 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:29.610053 master-0 kubenswrapper[13046]: I0308 03:13:29.609907 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:29.610053 master-0 kubenswrapper[13046]: I0308 03:13:29.609951 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:29.610053 master-0 kubenswrapper[13046]: I0308 03:13:29.609962 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:29.610422 master-0 kubenswrapper[13046]: I0308 03:13:29.610383 13046 scope.go:117] "RemoveContainer" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765"
Mar 08 03:13:29.610705 master-0 kubenswrapper[13046]: E0308 03:13:29.610653 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:13:30.062124 master-0 kubenswrapper[13046]: I0308 03:13:30.062023 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:30.491210 master-0 kubenswrapper[13046]: I0308 03:13:30.491126 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:30.611234 master-0 kubenswrapper[13046]: I0308 03:13:30.611170 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:30.613446 master-0 kubenswrapper[13046]: I0308 03:13:30.613409 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:30.613629 master-0 kubenswrapper[13046]: I0308 03:13:30.613468 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:30.613629 master-0 kubenswrapper[13046]: I0308 03:13:30.613523 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:30.613890 master-0 kubenswrapper[13046]: I0308 03:13:30.613861 13046 scope.go:117] "RemoveContainer" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765"
Mar 08 03:13:30.614137 master-0 kubenswrapper[13046]: E0308 03:13:30.614082 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:13:30.814714 master-0 kubenswrapper[13046]: E0308 03:13:30.814479 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 08 03:13:31.061886 master-0 kubenswrapper[13046]: I0308 03:13:31.061786 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:31.432305 master-0 kubenswrapper[13046]: I0308 03:13:31.432234 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:31.435946 master-0 kubenswrapper[13046]: I0308 03:13:31.435893 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:31.436082 master-0 kubenswrapper[13046]: I0308 03:13:31.435969 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:31.436082 master-0 kubenswrapper[13046]: I0308 03:13:31.435990 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:31.436082 master-0 kubenswrapper[13046]: I0308 03:13:31.436029 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:13:31.437515 master-0 kubenswrapper[13046]: E0308 03:13:31.437421 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:13:32.062882 master-0 kubenswrapper[13046]: I0308 03:13:32.062767 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:33.061312 master-0 kubenswrapper[13046]: I0308 03:13:33.061224 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:33.658043 master-0 kubenswrapper[13046]: W0308 03:13:33.657950 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:33.658654 master-0 kubenswrapper[13046]: E0308 03:13:33.658051 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:34.062827 master-0 kubenswrapper[13046]: I0308 03:13:34.062672 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:35.062572 master-0 kubenswrapper[13046]: I0308 03:13:35.062453 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:35.509179 master-0 kubenswrapper[13046]: I0308 03:13:35.509083 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:13:35.509564 master-0 kubenswrapper[13046]: I0308 03:13:35.509397 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:35.513644 master-0 kubenswrapper[13046]: I0308 03:13:35.513581 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:35.513644 master-0 kubenswrapper[13046]: I0308 03:13:35.513643 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:35.513912 master-0 kubenswrapper[13046]: I0308 03:13:35.513661 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:35.514476 master-0 kubenswrapper[13046]: I0308 03:13:35.514411 13046 scope.go:117] "RemoveContainer" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765"
Mar 08 03:13:35.514905 master-0 kubenswrapper[13046]: E0308 03:13:35.514856 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:13:36.062227 master-0 kubenswrapper[13046]: I0308 03:13:36.062137 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:36.146514 master-0 kubenswrapper[13046]: W0308 03:13:36.146347 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:36.146514 master-0 kubenswrapper[13046]: E0308 03:13:36.146471 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:36.231383 master-0 kubenswrapper[13046]: W0308 03:13:36.231249 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:36.231383 master-0 kubenswrapper[13046]: E0308 03:13:36.231364 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:36.915948 master-0 kubenswrapper[13046]: E0308 03:13:36.915659 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:13:37.061951 master-0 kubenswrapper[13046]: I0308 03:13:37.061865 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:37.300683 master-0 kubenswrapper[13046]: W0308 03:13:37.300429 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:37.300683 master-0 kubenswrapper[13046]: E0308 03:13:37.300560 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:13:37.816693 master-0 kubenswrapper[13046]: E0308 03:13:37.816619 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 08 03:13:38.062249 master-0 kubenswrapper[13046]: I0308 03:13:38.062118 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:38.437665 master-0 kubenswrapper[13046]: I0308 03:13:38.437556 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:13:38.441336 master-0 kubenswrapper[13046]: I0308 03:13:38.441271 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:13:38.441336 master-0 kubenswrapper[13046]: I0308 03:13:38.441328 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:13:38.441336 master-0 kubenswrapper[13046]: I0308 03:13:38.441345 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:13:38.441674 master-0 kubenswrapper[13046]: I0308 03:13:38.441374 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:13:38.442632 master-0 kubenswrapper[13046]: E0308 03:13:38.442551 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:13:38.718511 master-0 kubenswrapper[13046]: E0308 03:13:38.718331 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:13:39.062657 master-0 kubenswrapper[13046]: I0308 03:13:39.062546 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:40.062463 master-0 kubenswrapper[13046]: I0308 03:13:40.062315 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:41.062337 master-0 kubenswrapper[13046]: I0308 03:13:41.062257 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:42.062894 master-0 kubenswrapper[13046]: I0308 03:13:42.062727 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:43.062530 master-0 kubenswrapper[13046]: I0308 03:13:43.062432 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:13:44.061935 master-0 kubenswrapper[13046]: I0308 03:13:44.061842 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:44.711390 master-0 kubenswrapper[13046]: I0308 03:13:44.711327 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xjg74_a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/approver/0.log" Mar 08 03:13:44.712102 master-0 kubenswrapper[13046]: I0308 03:13:44.712043 13046 generic.go:334] "Generic (PLEG): container finished" podID="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" containerID="0f90c7e80ee619a77867feffa666b20dfa8fad2e9ecc5d700b999460ff6d737b" exitCode=1 Mar 08 03:13:44.819527 master-0 kubenswrapper[13046]: E0308 03:13:44.819401 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:13:45.063095 master-0 kubenswrapper[13046]: I0308 03:13:45.062913 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:45.443284 master-0 kubenswrapper[13046]: I0308 03:13:45.443114 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:45.446631 master-0 kubenswrapper[13046]: I0308 03:13:45.446572 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:45.446737 master-0 kubenswrapper[13046]: I0308 03:13:45.446644 13046 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:45.446737 master-0 kubenswrapper[13046]: I0308 03:13:45.446669 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:45.446737 master-0 kubenswrapper[13046]: I0308 03:13:45.446704 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:13:45.447999 master-0 kubenswrapper[13046]: E0308 03:13:45.447941 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:13:46.062068 master-0 kubenswrapper[13046]: I0308 03:13:46.061943 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:46.917452 master-0 kubenswrapper[13046]: E0308 03:13:46.917229 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 
03:13:47.062214 master-0 kubenswrapper[13046]: I0308 03:13:47.062101 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:47.117944 master-0 kubenswrapper[13046]: I0308 03:13:47.117869 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:47.121316 master-0 kubenswrapper[13046]: I0308 03:13:47.121230 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:47.121316 master-0 kubenswrapper[13046]: I0308 03:13:47.121312 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:47.121622 master-0 kubenswrapper[13046]: I0308 03:13:47.121330 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:47.122107 master-0 kubenswrapper[13046]: I0308 03:13:47.122008 13046 scope.go:117] "RemoveContainer" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765" Mar 08 03:13:47.743771 master-0 kubenswrapper[13046]: I0308 03:13:47.743724 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b"} Mar 08 03:13:47.743950 master-0 kubenswrapper[13046]: I0308 03:13:47.743832 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:47.745743 master-0 kubenswrapper[13046]: I0308 03:13:47.745711 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 
03:13:47.745785 master-0 kubenswrapper[13046]: I0308 03:13:47.745762 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:47.745785 master-0 kubenswrapper[13046]: I0308 03:13:47.745780 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:48.062200 master-0 kubenswrapper[13046]: I0308 03:13:48.062031 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:48.720284 master-0 kubenswrapper[13046]: E0308 03:13:48.720210 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:13:48.756155 master-0 kubenswrapper[13046]: I0308 03:13:48.756051 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b" exitCode=1 Mar 08 03:13:48.756155 master-0 kubenswrapper[13046]: I0308 03:13:48.756115 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b"} Mar 08 03:13:48.756155 master-0 kubenswrapper[13046]: I0308 03:13:48.756166 13046 scope.go:117] "RemoveContainer" containerID="4f2927f031452f1070f2426fc116b397f5df5872c99ddb86c1b7a0b01fedb765" Mar 08 03:13:48.756577 master-0 kubenswrapper[13046]: I0308 03:13:48.756549 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:48.759956 master-0 kubenswrapper[13046]: I0308 03:13:48.759539 13046 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:48.759956 master-0 kubenswrapper[13046]: I0308 03:13:48.759596 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:48.759956 master-0 kubenswrapper[13046]: I0308 03:13:48.759620 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:48.760344 master-0 kubenswrapper[13046]: I0308 03:13:48.760295 13046 scope.go:117] "RemoveContainer" containerID="228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b" Mar 08 03:13:48.760927 master-0 kubenswrapper[13046]: E0308 03:13:48.760880 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:13:49.062807 master-0 kubenswrapper[13046]: I0308 03:13:49.062562 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:49.178388 master-0 kubenswrapper[13046]: I0308 03:13:49.178300 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:49.178388 master-0 kubenswrapper[13046]: I0308 03:13:49.178381 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:49.766854 master-0 kubenswrapper[13046]: I0308 
03:13:49.766794 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:49.770642 master-0 kubenswrapper[13046]: I0308 03:13:49.770560 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:49.770837 master-0 kubenswrapper[13046]: I0308 03:13:49.770800 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:49.770946 master-0 kubenswrapper[13046]: I0308 03:13:49.770845 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:49.771603 master-0 kubenswrapper[13046]: I0308 03:13:49.771563 13046 scope.go:117] "RemoveContainer" containerID="228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b" Mar 08 03:13:49.772090 master-0 kubenswrapper[13046]: E0308 03:13:49.772017 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:13:50.062197 master-0 kubenswrapper[13046]: I0308 03:13:50.061984 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:50.490733 master-0 kubenswrapper[13046]: I0308 03:13:50.490630 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:50.776227 master-0 kubenswrapper[13046]: I0308 03:13:50.776049 13046 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:50.780081 master-0 kubenswrapper[13046]: I0308 03:13:50.780010 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:50.780081 master-0 kubenswrapper[13046]: I0308 03:13:50.780071 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:50.780340 master-0 kubenswrapper[13046]: I0308 03:13:50.780094 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:50.780742 master-0 kubenswrapper[13046]: I0308 03:13:50.780678 13046 scope.go:117] "RemoveContainer" containerID="228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b" Mar 08 03:13:50.781170 master-0 kubenswrapper[13046]: E0308 03:13:50.781095 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:13:51.061735 master-0 kubenswrapper[13046]: I0308 03:13:51.061610 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:51.820737 master-0 kubenswrapper[13046]: E0308 03:13:51.820666 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: 
connection refused" interval="7s" Mar 08 03:13:52.018233 master-0 kubenswrapper[13046]: W0308 03:13:52.018132 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:52.018553 master-0 kubenswrapper[13046]: E0308 03:13:52.018258 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:52.062840 master-0 kubenswrapper[13046]: I0308 03:13:52.062754 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:52.448686 master-0 kubenswrapper[13046]: I0308 03:13:52.448583 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:52.457843 master-0 kubenswrapper[13046]: I0308 03:13:52.457712 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:52.457843 master-0 kubenswrapper[13046]: I0308 03:13:52.457795 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:52.457843 master-0 kubenswrapper[13046]: I0308 03:13:52.457815 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:52.457843 master-0 kubenswrapper[13046]: I0308 03:13:52.457849 13046 
kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:13:52.459239 master-0 kubenswrapper[13046]: E0308 03:13:52.459178 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:13:52.792797 master-0 kubenswrapper[13046]: I0308 03:13:52.792626 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_0781e6af-f5b5-40f7-bb7f-5bc6978b4957/installer/0.log" Mar 08 03:13:52.792797 master-0 kubenswrapper[13046]: I0308 03:13:52.792728 13046 generic.go:334] "Generic (PLEG): container finished" podID="0781e6af-f5b5-40f7-bb7f-5bc6978b4957" containerID="d6a6af9b5c35efad9748f9601d83a886b44fae8599777699f25ccf5aa2fcd4b8" exitCode=1 Mar 08 03:13:53.062386 master-0 kubenswrapper[13046]: I0308 03:13:53.062271 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:54.062086 master-0 kubenswrapper[13046]: I0308 03:13:54.061972 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:54.863019 master-0 kubenswrapper[13046]: W0308 03:13:54.862896 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:54.863906 master-0 kubenswrapper[13046]: E0308 03:13:54.863019 13046 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:54.951263 master-0 kubenswrapper[13046]: W0308 03:13:54.951114 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:54.951599 master-0 kubenswrapper[13046]: E0308 03:13:54.951266 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:13:55.062158 master-0 kubenswrapper[13046]: I0308 03:13:55.062083 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:55.509216 master-0 kubenswrapper[13046]: I0308 03:13:55.509163 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:13:55.509743 master-0 kubenswrapper[13046]: I0308 03:13:55.509711 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:55.513186 master-0 kubenswrapper[13046]: I0308 03:13:55.513148 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" 
Mar 08 03:13:55.513399 master-0 kubenswrapper[13046]: I0308 03:13:55.513370 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:55.513653 master-0 kubenswrapper[13046]: I0308 03:13:55.513588 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:55.514409 master-0 kubenswrapper[13046]: I0308 03:13:55.514380 13046 scope.go:117] "RemoveContainer" containerID="228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b" Mar 08 03:13:55.514923 master-0 kubenswrapper[13046]: E0308 03:13:55.514888 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:13:56.062572 master-0 kubenswrapper[13046]: I0308 03:13:56.062431 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:56.919260 master-0 kubenswrapper[13046]: E0308 03:13:56.919077 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:13:57.062125 master-0 kubenswrapper[13046]: I0308 03:13:57.062063 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:58.061864 master-0 kubenswrapper[13046]: I0308 03:13:58.061767 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:58.720563 master-0 kubenswrapper[13046]: E0308 03:13:58.720499 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:13:58.822312 master-0 kubenswrapper[13046]: E0308 03:13:58.822230 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:13:59.062414 master-0 kubenswrapper[13046]: I0308 03:13:59.062221 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:59.459529 master-0 kubenswrapper[13046]: I0308 03:13:59.459452 13046 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:13:59.462439 master-0 kubenswrapper[13046]: I0308 03:13:59.462368 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:13:59.462439 master-0 kubenswrapper[13046]: I0308 03:13:59.462443 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:13:59.462668 master-0 kubenswrapper[13046]: I0308 03:13:59.462468 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:13:59.462668 master-0 kubenswrapper[13046]: I0308 03:13:59.462529 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:13:59.463750 master-0 kubenswrapper[13046]: E0308 03:13:59.463660 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:13:59.765959 master-0 kubenswrapper[13046]: W0308 03:13:59.765755 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:13:59.765959 master-0 kubenswrapper[13046]: E0308 03:13:59.765868 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:14:00.061972 master-0 kubenswrapper[13046]: I0308 03:14:00.061845 13046 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:01.061350 master-0 kubenswrapper[13046]: I0308 03:14:01.061271 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:02.062884 master-0 kubenswrapper[13046]: I0308 03:14:02.062772 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:03.062539 master-0 kubenswrapper[13046]: I0308 03:14:03.062409 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:04.062619 master-0 kubenswrapper[13046]: I0308 03:14:04.062425 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:05.062316 master-0 kubenswrapper[13046]: I0308 03:14:05.062197 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:05.823955 master-0 kubenswrapper[13046]: E0308 03:14:05.823820 13046 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:06.062383 master-0 kubenswrapper[13046]: I0308 03:14:06.062288 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:06.464767 master-0 kubenswrapper[13046]: I0308 03:14:06.464653 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:06.468879 master-0 kubenswrapper[13046]: I0308 03:14:06.468802 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:06.468879 master-0 kubenswrapper[13046]: I0308 03:14:06.468880 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:06.469084 master-0 kubenswrapper[13046]: I0308 03:14:06.468904 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:06.469084 master-0 kubenswrapper[13046]: I0308 03:14:06.468941 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:06.470428 master-0 kubenswrapper[13046]: E0308 03:14:06.470330 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:14:06.921283 master-0 kubenswrapper[13046]: E0308 03:14:06.921075 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 
192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:14:07.061395 master-0 kubenswrapper[13046]: I0308 03:14:07.061319 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:08.061888 master-0 kubenswrapper[13046]: I0308 03:14:08.061784 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:08.720779 master-0 kubenswrapper[13046]: E0308 03:14:08.720723 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:14:09.062848 master-0 kubenswrapper[13046]: I0308 03:14:09.062682 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:10.062938 master-0 kubenswrapper[13046]: I0308 03:14:10.062801 13046 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:10.118652 master-0 kubenswrapper[13046]: I0308 03:14:10.118452 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:10.122412 master-0 kubenswrapper[13046]: I0308 03:14:10.122331 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:10.122412 master-0 kubenswrapper[13046]: I0308 03:14:10.122390 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:10.122412 master-0 kubenswrapper[13046]: I0308 03:14:10.122401 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:10.122934 master-0 kubenswrapper[13046]: I0308 03:14:10.122898 13046 scope.go:117] "RemoveContainer" containerID="228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b" Mar 08 03:14:10.925735 master-0 kubenswrapper[13046]: I0308 03:14:10.925602 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc"} Mar 08 03:14:10.926012 master-0 kubenswrapper[13046]: I0308 03:14:10.925811 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:10.929270 master-0 kubenswrapper[13046]: I0308 03:14:10.929197 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:10.929270 master-0 kubenswrapper[13046]: I0308 03:14:10.929263 13046 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:10.929472 master-0 kubenswrapper[13046]: I0308 03:14:10.929284 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:11.063024 master-0 kubenswrapper[13046]: I0308 03:14:11.062926 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:11.937231 master-0 kubenswrapper[13046]: I0308 03:14:11.937103 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" exitCode=1 Mar 08 03:14:11.937231 master-0 kubenswrapper[13046]: I0308 03:14:11.937191 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc"} Mar 08 03:14:11.937680 master-0 kubenswrapper[13046]: I0308 03:14:11.937265 13046 scope.go:117] "RemoveContainer" containerID="228e1154f4e941097c2a4a15d6969d7eb424c6dfdbcdd6b1ca10aec02aa5105b" Mar 08 03:14:11.937680 master-0 kubenswrapper[13046]: I0308 03:14:11.937534 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:11.942797 master-0 kubenswrapper[13046]: I0308 03:14:11.942731 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:11.942951 master-0 kubenswrapper[13046]: I0308 03:14:11.942812 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:11.942951 master-0 kubenswrapper[13046]: I0308 
03:14:11.942844 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:11.943457 master-0 kubenswrapper[13046]: I0308 03:14:11.943406 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:14:11.943958 master-0 kubenswrapper[13046]: E0308 03:14:11.943887 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:14:12.062806 master-0 kubenswrapper[13046]: I0308 03:14:12.062688 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:12.826112 master-0 kubenswrapper[13046]: E0308 03:14:12.826022 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:13.062431 master-0 kubenswrapper[13046]: I0308 03:14:13.062376 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:13.471435 master-0 kubenswrapper[13046]: I0308 03:14:13.471318 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 08 03:14:13.475053 master-0 kubenswrapper[13046]: I0308 03:14:13.474998 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:13.475053 master-0 kubenswrapper[13046]: I0308 03:14:13.475054 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:13.475257 master-0 kubenswrapper[13046]: I0308 03:14:13.475074 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:13.475257 master-0 kubenswrapper[13046]: I0308 03:14:13.475104 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:13.476296 master-0 kubenswrapper[13046]: E0308 03:14:13.476228 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:14:13.956046 master-0 kubenswrapper[13046]: I0308 03:14:13.955971 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/0.log" Mar 08 03:14:13.956861 master-0 kubenswrapper[13046]: I0308 03:14:13.956052 13046 generic.go:334] "Generic (PLEG): container finished" podID="fd6b827c-70b0-47ed-b07c-c696343248a8" containerID="927e976b2419f80e2b156dd6620627f0ab5b15535fdab986491afec086084730" exitCode=1 Mar 08 03:14:14.062449 master-0 kubenswrapper[13046]: I0308 03:14:14.062364 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:15.062225 master-0 kubenswrapper[13046]: I0308 03:14:15.062085 13046 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:15.509545 master-0 kubenswrapper[13046]: I0308 03:14:15.509368 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:14:15.509898 master-0 kubenswrapper[13046]: I0308 03:14:15.509638 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:15.513150 master-0 kubenswrapper[13046]: I0308 03:14:15.513096 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:15.513150 master-0 kubenswrapper[13046]: I0308 03:14:15.513153 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:15.513328 master-0 kubenswrapper[13046]: I0308 03:14:15.513172 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:15.513783 master-0 kubenswrapper[13046]: I0308 03:14:15.513737 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:14:15.514111 master-0 kubenswrapper[13046]: E0308 03:14:15.514061 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:14:16.062478 master-0 kubenswrapper[13046]: I0308 03:14:16.062374 13046 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:16.923598 master-0 kubenswrapper[13046]: E0308 03:14:16.923346 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:14:17.062285 master-0 kubenswrapper[13046]: I0308 03:14:17.062218 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:18.061672 master-0 kubenswrapper[13046]: I0308 03:14:18.061580 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:18.721060 master-0 kubenswrapper[13046]: E0308 03:14:18.721004 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 
03:14:19.062299 master-0 kubenswrapper[13046]: I0308 03:14:19.062103 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:19.177453 master-0 kubenswrapper[13046]: I0308 03:14:19.177341 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:14:19.177453 master-0 kubenswrapper[13046]: I0308 03:14:19.177460 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:14:19.177825 master-0 kubenswrapper[13046]: I0308 03:14:19.177732 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:19.180959 master-0 kubenswrapper[13046]: I0308 03:14:19.180918 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:19.180959 master-0 kubenswrapper[13046]: I0308 03:14:19.180963 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:19.181185 master-0 kubenswrapper[13046]: I0308 03:14:19.180980 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:19.181558 master-0 kubenswrapper[13046]: I0308 03:14:19.181522 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:14:19.181917 master-0 kubenswrapper[13046]: E0308 03:14:19.181848 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager 
pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:14:19.827859 master-0 kubenswrapper[13046]: E0308 03:14:19.827621 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:20.062329 master-0 kubenswrapper[13046]: I0308 03:14:20.062213 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:20.477451 master-0 kubenswrapper[13046]: I0308 03:14:20.477352 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:20.480941 master-0 kubenswrapper[13046]: I0308 03:14:20.480883 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:20.480941 master-0 kubenswrapper[13046]: I0308 03:14:20.480937 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:20.481156 master-0 kubenswrapper[13046]: I0308 03:14:20.480952 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:20.481156 master-0 kubenswrapper[13046]: I0308 03:14:20.480977 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:20.482005 master-0 kubenswrapper[13046]: E0308 03:14:20.481931 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:14:20.491159 master-0 kubenswrapper[13046]: I0308 03:14:20.491116 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:14:20.491360 master-0 kubenswrapper[13046]: I0308 03:14:20.491314 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:20.494735 master-0 kubenswrapper[13046]: I0308 03:14:20.494679 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:20.494822 master-0 kubenswrapper[13046]: I0308 03:14:20.494765 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:20.494822 master-0 kubenswrapper[13046]: I0308 03:14:20.494792 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:20.495593 master-0 kubenswrapper[13046]: I0308 03:14:20.495535 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:14:20.496031 master-0 kubenswrapper[13046]: E0308 03:14:20.495985 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:14:21.062858 master-0 kubenswrapper[13046]: I0308 03:14:21.062728 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:22.062718 master-0 kubenswrapper[13046]: I0308 03:14:22.062619 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:23.061655 master-0 kubenswrapper[13046]: I0308 03:14:23.061584 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:24.061850 master-0 kubenswrapper[13046]: I0308 03:14:24.061754 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:25.062042 master-0 kubenswrapper[13046]: I0308 03:14:25.061951 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:25.711995 master-0 kubenswrapper[13046]: W0308 03:14:25.711869 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:25.712278 master-0 kubenswrapper[13046]: E0308 03:14:25.711999 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:14:26.062938 master-0 kubenswrapper[13046]: I0308 03:14:26.062761 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:26.829277 master-0 kubenswrapper[13046]: E0308 03:14:26.829149 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:26.925260 master-0 kubenswrapper[13046]: E0308 03:14:26.924990 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:14:27.062376 master-0 kubenswrapper[13046]: I0308 03:14:27.062266 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:27.117744 master-0 kubenswrapper[13046]: I0308 03:14:27.117654 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:27.124573 master-0 kubenswrapper[13046]: I0308 03:14:27.122393 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:27.124573 master-0 kubenswrapper[13046]: I0308 03:14:27.122474 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:27.124573 master-0 kubenswrapper[13046]: I0308 03:14:27.122533 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:27.482777 master-0 kubenswrapper[13046]: I0308 03:14:27.482590 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:27.486772 master-0 kubenswrapper[13046]: I0308 03:14:27.486703 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:27.486772 master-0 kubenswrapper[13046]: I0308 03:14:27.486774 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:27.487084 master-0 kubenswrapper[13046]: I0308 03:14:27.486799 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:27.487084 master-0 kubenswrapper[13046]: I0308 03:14:27.486837 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:27.488201 master-0 kubenswrapper[13046]: E0308 03:14:27.488116 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:14:27.768762 master-0 kubenswrapper[13046]: W0308 03:14:27.768529 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:27.768762 master-0 kubenswrapper[13046]: E0308 03:14:27.768651 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:14:28.062024 master-0 kubenswrapper[13046]: I0308 03:14:28.061800 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:28.721692 master-0 kubenswrapper[13046]: E0308 03:14:28.721648 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:14:29.062226 master-0 kubenswrapper[13046]: I0308 03:14:29.061988 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:29.117802 master-0 kubenswrapper[13046]: I0308 03:14:29.117723 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:29.121008 
master-0 kubenswrapper[13046]: I0308 03:14:29.120966 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:29.121079 master-0 kubenswrapper[13046]: I0308 03:14:29.121022 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:29.121079 master-0 kubenswrapper[13046]: I0308 03:14:29.121041 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:30.061895 master-0 kubenswrapper[13046]: I0308 03:14:30.061821 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:31.061970 master-0 kubenswrapper[13046]: I0308 03:14:31.061884 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:32.062370 master-0 kubenswrapper[13046]: I0308 03:14:32.062265 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:33.062234 master-0 kubenswrapper[13046]: I0308 03:14:33.062135 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:33.079640 master-0 kubenswrapper[13046]: I0308 03:14:33.079550 13046 generic.go:334] "Generic (PLEG): container 
finished" podID="eedc7538-9cc6-4bf5-9628-e278310d796b" containerID="df3e8baabefc90e04c02f0f45ed7aa89841f1f4954012b9c683b090559c5e516" exitCode=0 Mar 08 03:14:33.831343 master-0 kubenswrapper[13046]: E0308 03:14:33.831214 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:34.062349 master-0 kubenswrapper[13046]: I0308 03:14:34.062250 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:34.488742 master-0 kubenswrapper[13046]: I0308 03:14:34.488617 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:34.492756 master-0 kubenswrapper[13046]: I0308 03:14:34.492577 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:34.492756 master-0 kubenswrapper[13046]: I0308 03:14:34.492643 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:34.492756 master-0 kubenswrapper[13046]: I0308 03:14:34.492670 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:34.492756 master-0 kubenswrapper[13046]: I0308 03:14:34.492719 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:34.494240 master-0 kubenswrapper[13046]: E0308 03:14:34.494167 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: 
connection refused" node="master-0" Mar 08 03:14:35.062445 master-0 kubenswrapper[13046]: I0308 03:14:35.062288 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:35.117936 master-0 kubenswrapper[13046]: I0308 03:14:35.117854 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:35.121558 master-0 kubenswrapper[13046]: I0308 03:14:35.121441 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:35.121558 master-0 kubenswrapper[13046]: I0308 03:14:35.121555 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:35.121759 master-0 kubenswrapper[13046]: I0308 03:14:35.121582 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:35.122314 master-0 kubenswrapper[13046]: I0308 03:14:35.122265 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:14:35.122739 master-0 kubenswrapper[13046]: E0308 03:14:35.122672 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:14:36.062250 master-0 kubenswrapper[13046]: I0308 03:14:36.062164 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:36.926713 master-0 kubenswrapper[13046]: E0308 03:14:36.926410 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:14:37.062209 master-0 kubenswrapper[13046]: I0308 03:14:37.062132 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:38.062803 master-0 kubenswrapper[13046]: I0308 03:14:38.062709 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:38.723058 master-0 kubenswrapper[13046]: E0308 03:14:38.722978 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:14:39.062040 master-0 kubenswrapper[13046]: I0308 
03:14:39.061872 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:40.062786 master-0 kubenswrapper[13046]: I0308 03:14:40.062622 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:40.833305 master-0 kubenswrapper[13046]: E0308 03:14:40.833120 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:41.112082 master-0 kubenswrapper[13046]: I0308 03:14:41.112033 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:41.495036 master-0 kubenswrapper[13046]: I0308 03:14:41.494878 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:41.497639 master-0 kubenswrapper[13046]: I0308 03:14:41.497557 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:41.497793 master-0 kubenswrapper[13046]: I0308 03:14:41.497644 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:41.497793 master-0 kubenswrapper[13046]: I0308 03:14:41.497676 13046 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:41.497793 master-0 kubenswrapper[13046]: I0308 03:14:41.497721 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:41.499131 master-0 kubenswrapper[13046]: E0308 03:14:41.499064 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:14:42.061450 master-0 kubenswrapper[13046]: I0308 03:14:42.061362 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:42.451177 master-0 kubenswrapper[13046]: W0308 03:14:42.451031 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:42.451177 master-0 kubenswrapper[13046]: E0308 03:14:42.451151 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:14:43.062571 master-0 kubenswrapper[13046]: I0308 03:14:43.062453 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:44.062291 master-0 kubenswrapper[13046]: I0308 
03:14:44.062188 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:44.118110 master-0 kubenswrapper[13046]: I0308 03:14:44.118046 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:44.122056 master-0 kubenswrapper[13046]: I0308 03:14:44.121955 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:44.122056 master-0 kubenswrapper[13046]: I0308 03:14:44.122020 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:44.122056 master-0 kubenswrapper[13046]: I0308 03:14:44.122040 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:45.062278 master-0 kubenswrapper[13046]: I0308 03:14:45.062158 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:46.061908 master-0 kubenswrapper[13046]: I0308 03:14:46.061776 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:46.118378 master-0 kubenswrapper[13046]: I0308 03:14:46.118260 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:46.121943 master-0 kubenswrapper[13046]: I0308 03:14:46.121881 13046 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:46.122071 master-0 kubenswrapper[13046]: I0308 03:14:46.121948 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:46.122071 master-0 kubenswrapper[13046]: I0308 03:14:46.121974 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:46.122579 master-0 kubenswrapper[13046]: I0308 03:14:46.122536 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:14:46.122918 master-0 kubenswrapper[13046]: E0308 03:14:46.122866 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:14:46.929133 master-0 kubenswrapper[13046]: E0308 03:14:46.928921 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:14:47.037200 master-0 kubenswrapper[13046]: W0308 03:14:47.037048 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:47.037566 master-0 kubenswrapper[13046]: E0308 03:14:47.037204 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:14:47.062070 master-0 kubenswrapper[13046]: I0308 03:14:47.061983 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:47.835093 master-0 kubenswrapper[13046]: E0308 03:14:47.834982 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:48.062384 master-0 kubenswrapper[13046]: I0308 03:14:48.062292 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:48.500095 master-0 kubenswrapper[13046]: I0308 03:14:48.499928 13046 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 08 03:14:48.504136 master-0 kubenswrapper[13046]: I0308 03:14:48.504075 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:48.504248 master-0 kubenswrapper[13046]: I0308 03:14:48.504151 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:48.504248 master-0 kubenswrapper[13046]: I0308 03:14:48.504171 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:48.504248 master-0 kubenswrapper[13046]: I0308 03:14:48.504200 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:48.505516 master-0 kubenswrapper[13046]: E0308 03:14:48.505424 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:14:48.723654 master-0 kubenswrapper[13046]: E0308 03:14:48.723526 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:14:49.062076 master-0 kubenswrapper[13046]: I0308 03:14:49.061967 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:50.062963 master-0 kubenswrapper[13046]: I0308 03:14:50.062813 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:51.062212 master-0 
kubenswrapper[13046]: I0308 03:14:51.061991 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:52.061649 master-0 kubenswrapper[13046]: I0308 03:14:52.061475 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:53.062127 master-0 kubenswrapper[13046]: I0308 03:14:53.062043 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:54.062744 master-0 kubenswrapper[13046]: I0308 03:14:54.062622 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:54.836842 master-0 kubenswrapper[13046]: E0308 03:14:54.836715 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:14:55.061914 master-0 kubenswrapper[13046]: I0308 03:14:55.061811 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:55.506519 master-0 
kubenswrapper[13046]: I0308 03:14:55.506379 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:14:55.510858 master-0 kubenswrapper[13046]: I0308 03:14:55.510800 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:14:55.510858 master-0 kubenswrapper[13046]: I0308 03:14:55.510852 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:14:55.511049 master-0 kubenswrapper[13046]: I0308 03:14:55.510871 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:14:55.511049 master-0 kubenswrapper[13046]: I0308 03:14:55.510900 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:14:55.512015 master-0 kubenswrapper[13046]: E0308 03:14:55.511948 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:14:56.062813 master-0 kubenswrapper[13046]: I0308 03:14:56.062703 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:56.931287 master-0 kubenswrapper[13046]: E0308 03:14:56.931062 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:14:57.062673 master-0 kubenswrapper[13046]: I0308 03:14:57.062470 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:58.062193 master-0 kubenswrapper[13046]: I0308 03:14:58.062094 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:14:58.724298 master-0 kubenswrapper[13046]: E0308 03:14:58.724184 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:14:59.067533 master-0 kubenswrapper[13046]: I0308 03:14:59.067361 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:00.061739 master-0 kubenswrapper[13046]: I0308 03:15:00.061667 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 
192.168.32.10:6443: connect: connection refused Mar 08 03:15:01.062160 master-0 kubenswrapper[13046]: I0308 03:15:01.061936 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:01.118258 master-0 kubenswrapper[13046]: I0308 03:15:01.118126 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:01.121635 master-0 kubenswrapper[13046]: I0308 03:15:01.121583 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:01.121807 master-0 kubenswrapper[13046]: I0308 03:15:01.121638 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:01.121807 master-0 kubenswrapper[13046]: I0308 03:15:01.121661 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:01.122294 master-0 kubenswrapper[13046]: I0308 03:15:01.122240 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:15:01.278999 master-0 kubenswrapper[13046]: I0308 03:15:01.278927 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/0.log" Mar 08 03:15:01.278999 master-0 kubenswrapper[13046]: I0308 03:15:01.279004 13046 generic.go:334] "Generic (PLEG): container finished" podID="555ae3b4-71c6-4b62-9e09-66a58ae4c6ad" containerID="34a41043128393510c095711912036e3de6953d35852c470aeee13ef6010b118" exitCode=1 Mar 08 03:15:01.839320 master-0 kubenswrapper[13046]: E0308 03:15:01.839159 13046 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:02.062236 master-0 kubenswrapper[13046]: I0308 03:15:02.062180 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:02.289296 master-0 kubenswrapper[13046]: I0308 03:15:02.289221 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" exitCode=1 Mar 08 03:15:02.289296 master-0 kubenswrapper[13046]: I0308 03:15:02.289278 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675"} Mar 08 03:15:02.289553 master-0 kubenswrapper[13046]: I0308 03:15:02.289338 13046 scope.go:117] "RemoveContainer" containerID="3de8c0d66b1b813cc20da4f3103e2ffc852604be35456950aaa1dc2b778b09dc" Mar 08 03:15:02.289619 master-0 kubenswrapper[13046]: I0308 03:15:02.289539 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:02.292774 master-0 kubenswrapper[13046]: I0308 03:15:02.292727 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:02.292867 master-0 kubenswrapper[13046]: I0308 03:15:02.292785 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:02.292867 master-0 kubenswrapper[13046]: I0308 03:15:02.292805 13046 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:02.293344 master-0 kubenswrapper[13046]: I0308 03:15:02.293304 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:15:02.293929 master-0 kubenswrapper[13046]: E0308 03:15:02.293878 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:15:02.512343 master-0 kubenswrapper[13046]: I0308 03:15:02.512137 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:02.515837 master-0 kubenswrapper[13046]: I0308 03:15:02.515766 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:02.516156 master-0 kubenswrapper[13046]: I0308 03:15:02.515843 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:02.516156 master-0 kubenswrapper[13046]: I0308 03:15:02.516047 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:02.516156 master-0 kubenswrapper[13046]: I0308 03:15:02.516079 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:02.517380 master-0 kubenswrapper[13046]: E0308 03:15:02.517301 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 
03:15:03.062955 master-0 kubenswrapper[13046]: I0308 03:15:03.062827 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:03.301377 master-0 kubenswrapper[13046]: I0308 03:15:03.301202 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/0.log" Mar 08 03:15:03.301377 master-0 kubenswrapper[13046]: I0308 03:15:03.301288 13046 generic.go:334] "Generic (PLEG): container finished" podID="5a2c9576-f7bd-4ac5-a7fe-530f26642f97" containerID="27ab0f00e980c7d4d9fcf7e8c62f276ea49b975eb80fef82536adf6bfc74a796" exitCode=1 Mar 08 03:15:04.062529 master-0 kubenswrapper[13046]: I0308 03:15:04.062438 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:05.062657 master-0 kubenswrapper[13046]: I0308 03:15:05.062566 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:05.508899 master-0 kubenswrapper[13046]: I0308 03:15:05.508819 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:15:05.509194 master-0 kubenswrapper[13046]: I0308 03:15:05.509059 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:05.512620 master-0 kubenswrapper[13046]: I0308 
03:15:05.512542 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:05.512761 master-0 kubenswrapper[13046]: I0308 03:15:05.512651 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:05.512761 master-0 kubenswrapper[13046]: I0308 03:15:05.512673 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:05.513365 master-0 kubenswrapper[13046]: I0308 03:15:05.513317 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:15:05.513767 master-0 kubenswrapper[13046]: E0308 03:15:05.513712 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:15:06.062752 master-0 kubenswrapper[13046]: I0308 03:15:06.062631 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:06.933740 master-0 kubenswrapper[13046]: E0308 03:15:06.933461 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:06.933740 master-0 kubenswrapper[13046]: E0308 03:15:06.933666 13046 event.go:307] "Unable to write event (retry limit exceeded!)" event="&Event{ObjectMeta:{master-0.189abf2bc685906f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,LastTimestamp:2026-03-08 03:13:18.058852463 +0000 UTC m=+0.137619680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:06.934764 master-0 kubenswrapper[13046]: E0308 03:15:06.934596 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC 
m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:07.061950 master-0 kubenswrapper[13046]: I0308 03:15:07.061894 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:08.062260 master-0 kubenswrapper[13046]: I0308 03:15:08.062182 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:08.724714 master-0 kubenswrapper[13046]: E0308 03:15:08.724578 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:15:08.841995 master-0 kubenswrapper[13046]: E0308 03:15:08.841931 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:09.062530 master-0 kubenswrapper[13046]: I0308 03:15:09.062394 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:09.178106 master-0 kubenswrapper[13046]: I0308 03:15:09.177979 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:15:09.178106 master-0 kubenswrapper[13046]: I0308 03:15:09.178046 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:15:09.178342 master-0 kubenswrapper[13046]: I0308 03:15:09.178189 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:09.181667 master-0 kubenswrapper[13046]: I0308 03:15:09.181604 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:09.181789 master-0 kubenswrapper[13046]: I0308 03:15:09.181675 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:09.181789 master-0 kubenswrapper[13046]: I0308 03:15:09.181697 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:09.182352 master-0 kubenswrapper[13046]: I0308 03:15:09.182308 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:15:09.182857 master-0 kubenswrapper[13046]: E0308 03:15:09.182804 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:15:09.518331 master-0 kubenswrapper[13046]: I0308 03:15:09.518157 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:09.521826 master-0 kubenswrapper[13046]: I0308 03:15:09.521740 13046 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:09.521826 master-0 kubenswrapper[13046]: I0308 03:15:09.521827 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:09.522045 master-0 kubenswrapper[13046]: I0308 03:15:09.521847 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:09.522045 master-0 kubenswrapper[13046]: I0308 03:15:09.521881 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:09.523036 master-0 kubenswrapper[13046]: E0308 03:15:09.522969 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:15:10.062049 master-0 kubenswrapper[13046]: I0308 03:15:10.061931 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:10.491215 master-0 kubenswrapper[13046]: I0308 03:15:10.491093 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:15:10.492121 master-0 kubenswrapper[13046]: I0308 03:15:10.491319 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:10.494171 master-0 kubenswrapper[13046]: I0308 03:15:10.494076 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:10.494171 master-0 kubenswrapper[13046]: I0308 03:15:10.494161 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 08 03:15:10.494596 master-0 kubenswrapper[13046]: I0308 03:15:10.494379 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:10.495095 master-0 kubenswrapper[13046]: I0308 03:15:10.495043 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:15:10.495463 master-0 kubenswrapper[13046]: E0308 03:15:10.495415 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:15:11.062743 master-0 kubenswrapper[13046]: I0308 03:15:11.062675 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:11.922708 master-0 kubenswrapper[13046]: W0308 03:15:11.922551 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:11.923571 master-0 kubenswrapper[13046]: E0308 03:15:11.922708 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 
03:15:12.063007 master-0 kubenswrapper[13046]: I0308 03:15:12.062899 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:12.355295 master-0 kubenswrapper[13046]: I0308 03:15:12.355198 13046 generic.go:334] "Generic (PLEG): container finished" podID="d358134e-5625-492c-b4f7-460798631270" containerID="723615f545a9b912d96e2b20f5beb286b3ce93e38e0a010ef0152a7b0e0c1b1e" exitCode=0 Mar 08 03:15:13.067954 master-0 kubenswrapper[13046]: I0308 03:15:13.061797 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:13.580543 master-0 kubenswrapper[13046]: E0308 03:15:13.580328 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:14.062323 master-0 kubenswrapper[13046]: I0308 03:15:14.062204 13046 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:15.062530 master-0 kubenswrapper[13046]: I0308 03:15:15.062417 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:15.844683 master-0 kubenswrapper[13046]: E0308 03:15:15.844575 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:16.062172 master-0 kubenswrapper[13046]: I0308 03:15:16.062069 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:16.524202 master-0 kubenswrapper[13046]: I0308 03:15:16.524072 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:16.527572 master-0 kubenswrapper[13046]: I0308 03:15:16.527523 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:16.527648 master-0 kubenswrapper[13046]: I0308 03:15:16.527582 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:16.527648 master-0 kubenswrapper[13046]: I0308 03:15:16.527600 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:16.527648 
master-0 kubenswrapper[13046]: I0308 03:15:16.527631 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:16.529244 master-0 kubenswrapper[13046]: E0308 03:15:16.529182 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:15:17.061597 master-0 kubenswrapper[13046]: I0308 03:15:17.061478 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:18.063131 master-0 kubenswrapper[13046]: I0308 03:15:18.062129 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:18.397687 master-0 kubenswrapper[13046]: I0308 03:15:18.397608 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/0.log" Mar 08 03:15:18.397964 master-0 kubenswrapper[13046]: I0308 03:15:18.397691 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" containerID="643f3b1d5189adb625272097c9d23e7af0847cd627439de5de3ccca7ed7bb060" exitCode=1 Mar 08 03:15:18.725314 master-0 kubenswrapper[13046]: E0308 03:15:18.725115 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:15:18.919732 master-0 kubenswrapper[13046]: W0308 03:15:18.919618 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:18.919900 master-0 kubenswrapper[13046]: E0308 03:15:18.919785 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:15:19.062394 master-0 kubenswrapper[13046]: I0308 03:15:19.062228 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:19.406230 master-0 kubenswrapper[13046]: I0308 03:15:19.406143 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/0.log" Mar 08 03:15:19.407231 master-0 kubenswrapper[13046]: I0308 03:15:19.406269 13046 generic.go:334] "Generic (PLEG): container finished" podID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" containerID="115308b4e38a50965cda00a6f3da9ba63adca456afd5e8dd547096a0f49ebb12" exitCode=1 Mar 08 03:15:19.409221 master-0 kubenswrapper[13046]: I0308 03:15:19.409180 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/0.log" Mar 08 03:15:19.409980 master-0 kubenswrapper[13046]: I0308 03:15:19.409919 13046 generic.go:334] "Generic (PLEG): container finished" podID="53254b19-b5b3-4f97-bc64-37be8b2a41b7" 
containerID="95cb1ab0414f6248676ceab0da8402d36a93f6fced2ddcec794373deb0d0db80" exitCode=1 Mar 08 03:15:20.062170 master-0 kubenswrapper[13046]: I0308 03:15:20.062074 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:21.063500 master-0 kubenswrapper[13046]: I0308 03:15:21.063410 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:22.061757 master-0 kubenswrapper[13046]: I0308 03:15:22.061676 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:22.847297 master-0 kubenswrapper[13046]: E0308 03:15:22.847194 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:23.062367 master-0 kubenswrapper[13046]: I0308 03:15:23.062261 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:23.530636 master-0 kubenswrapper[13046]: I0308 03:15:23.529862 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:23.532921 master-0 kubenswrapper[13046]: I0308 
03:15:23.532814 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:23.532921 master-0 kubenswrapper[13046]: I0308 03:15:23.532882 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:23.532921 master-0 kubenswrapper[13046]: I0308 03:15:23.532904 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:23.533052 master-0 kubenswrapper[13046]: I0308 03:15:23.532938 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:23.533620 master-0 kubenswrapper[13046]: E0308 03:15:23.533574 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:15:23.583163 master-0 kubenswrapper[13046]: E0308 03:15:23.583004 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:24.061628 master-0 kubenswrapper[13046]: I0308 03:15:24.061536 
13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:25.062851 master-0 kubenswrapper[13046]: I0308 03:15:25.062768 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:25.241453 master-0 kubenswrapper[13046]: E0308 03:15:25.241402 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/6883c4d457e8e6721bc051f311e603087d24fd4c1e513d035cfd6a59f2c949b5/diff" to get inode usage: stat /var/lib/containers/storage/overlay/6883c4d457e8e6721bc051f311e603087d24fd4c1e513d035cfd6a59f2c949b5/diff: no such file or directory, extraDiskErr: Mar 08 03:15:25.268396 master-0 kubenswrapper[13046]: E0308 03:15:25.268308 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/97ce9afd1a3738e92755e26167d458e3d4b7794d7f92e1450094a6a0c26a5216/diff" to get inode usage: stat /var/lib/containers/storage/overlay/97ce9afd1a3738e92755e26167d458e3d4b7794d7f92e1450094a6a0c26a5216/diff: no such file or directory, extraDiskErr: Mar 08 03:15:26.062243 master-0 kubenswrapper[13046]: I0308 03:15:26.062081 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:26.118466 master-0 kubenswrapper[13046]: I0308 03:15:26.118362 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 
03:15:26.121936 master-0 kubenswrapper[13046]: I0308 03:15:26.121851 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:26.122095 master-0 kubenswrapper[13046]: I0308 03:15:26.121954 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:26.122095 master-0 kubenswrapper[13046]: I0308 03:15:26.121982 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:26.123007 master-0 kubenswrapper[13046]: I0308 03:15:26.122941 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:15:26.123526 master-0 kubenswrapper[13046]: E0308 03:15:26.123423 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:15:27.062266 master-0 kubenswrapper[13046]: I0308 03:15:27.062148 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:27.406186 master-0 kubenswrapper[13046]: W0308 03:15:27.406067 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:27.406978 master-0 kubenswrapper[13046]: E0308 
03:15:27.406264 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:15:28.062684 master-0 kubenswrapper[13046]: I0308 03:15:28.062601 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:28.725836 master-0 kubenswrapper[13046]: E0308 03:15:28.725724 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:15:29.062931 master-0 kubenswrapper[13046]: I0308 03:15:29.062760 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:29.848809 master-0 kubenswrapper[13046]: E0308 03:15:29.848687 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:30.061802 master-0 kubenswrapper[13046]: I0308 03:15:30.061722 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:30.534240 master-0 kubenswrapper[13046]: I0308 
03:15:30.534159 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:30.537950 master-0 kubenswrapper[13046]: I0308 03:15:30.537903 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:30.538029 master-0 kubenswrapper[13046]: I0308 03:15:30.537952 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:30.538029 master-0 kubenswrapper[13046]: I0308 03:15:30.537969 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:30.538029 master-0 kubenswrapper[13046]: I0308 03:15:30.537998 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:30.539218 master-0 kubenswrapper[13046]: E0308 03:15:30.539129 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:15:31.062774 master-0 kubenswrapper[13046]: I0308 03:15:31.062676 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:32.061612 master-0 kubenswrapper[13046]: I0308 03:15:32.061531 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:33.062412 master-0 kubenswrapper[13046]: I0308 03:15:33.062305 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:33.584987 master-0 kubenswrapper[13046]: E0308 03:15:33.584769 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:34.062068 master-0 kubenswrapper[13046]: I0308 03:15:34.061947 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:35.062036 master-0 kubenswrapper[13046]: I0308 03:15:35.061883 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:36.062563 master-0 kubenswrapper[13046]: I0308 03:15:36.062405 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:36.850791 master-0 kubenswrapper[13046]: E0308 03:15:36.850669 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:37.062878 master-0 kubenswrapper[13046]: I0308 03:15:37.062784 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:37.539947 master-0 kubenswrapper[13046]: I0308 03:15:37.539896 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:37.543872 master-0 kubenswrapper[13046]: I0308 03:15:37.543810 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:37.544051 master-0 kubenswrapper[13046]: I0308 03:15:37.543937 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:37.544051 master-0 kubenswrapper[13046]: I0308 03:15:37.543959 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:37.544051 master-0 kubenswrapper[13046]: I0308 03:15:37.543990 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:37.545253 master-0 kubenswrapper[13046]: E0308 03:15:37.545179 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: 
connect: connection refused" node="master-0" Mar 08 03:15:38.067646 master-0 kubenswrapper[13046]: I0308 03:15:38.067508 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:38.118358 master-0 kubenswrapper[13046]: I0308 03:15:38.118306 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:38.122132 master-0 kubenswrapper[13046]: I0308 03:15:38.122068 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:38.122215 master-0 kubenswrapper[13046]: I0308 03:15:38.122151 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:38.122215 master-0 kubenswrapper[13046]: I0308 03:15:38.122186 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:38.123040 master-0 kubenswrapper[13046]: I0308 03:15:38.123002 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:15:38.123422 master-0 kubenswrapper[13046]: E0308 03:15:38.123378 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:15:38.726072 master-0 kubenswrapper[13046]: E0308 03:15:38.725961 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node 
\"master-0\" not found" Mar 08 03:15:39.062897 master-0 kubenswrapper[13046]: I0308 03:15:39.062674 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:40.062237 master-0 kubenswrapper[13046]: I0308 03:15:40.062165 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:40.118891 master-0 kubenswrapper[13046]: I0308 03:15:40.118801 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:40.125576 master-0 kubenswrapper[13046]: I0308 03:15:40.122868 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:40.125576 master-0 kubenswrapper[13046]: I0308 03:15:40.122932 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:40.125576 master-0 kubenswrapper[13046]: I0308 03:15:40.122970 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:41.069265 master-0 kubenswrapper[13046]: I0308 03:15:41.068958 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:41.855203 master-0 kubenswrapper[13046]: W0308 03:15:41.855036 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:41.855527 master-0 kubenswrapper[13046]: E0308 03:15:41.855202 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:15:42.061994 master-0 kubenswrapper[13046]: I0308 03:15:42.061875 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:43.062655 master-0 kubenswrapper[13046]: I0308 03:15:43.062535 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:43.117439 master-0 kubenswrapper[13046]: I0308 03:15:43.117366 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:43.120814 master-0 kubenswrapper[13046]: I0308 03:15:43.120741 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:43.120814 master-0 kubenswrapper[13046]: I0308 03:15:43.120813 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:43.121023 master-0 kubenswrapper[13046]: I0308 03:15:43.120837 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientPID" Mar 08 03:15:43.587076 master-0 kubenswrapper[13046]: E0308 03:15:43.586856 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:43.852467 master-0 kubenswrapper[13046]: E0308 03:15:43.852382 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:44.062311 master-0 kubenswrapper[13046]: I0308 03:15:44.062220 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:44.546235 master-0 kubenswrapper[13046]: I0308 03:15:44.546124 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:44.549924 master-0 kubenswrapper[13046]: I0308 03:15:44.549854 13046 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:44.549924 master-0 kubenswrapper[13046]: I0308 03:15:44.549916 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:44.550157 master-0 kubenswrapper[13046]: I0308 03:15:44.549939 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:44.550157 master-0 kubenswrapper[13046]: I0308 03:15:44.549975 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:44.551031 master-0 kubenswrapper[13046]: E0308 03:15:44.550912 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:15:45.062388 master-0 kubenswrapper[13046]: I0308 03:15:45.062292 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:46.062467 master-0 kubenswrapper[13046]: I0308 03:15:46.062398 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:47.062255 master-0 kubenswrapper[13046]: I0308 03:15:47.062148 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:48.062787 master-0 kubenswrapper[13046]: I0308 03:15:48.062679 13046 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:48.726405 master-0 kubenswrapper[13046]: E0308 03:15:48.726347 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:15:49.062882 master-0 kubenswrapper[13046]: I0308 03:15:49.062725 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:49.117572 master-0 kubenswrapper[13046]: I0308 03:15:49.117476 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:49.121001 master-0 kubenswrapper[13046]: I0308 03:15:49.120946 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:49.121077 master-0 kubenswrapper[13046]: I0308 03:15:49.121014 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:49.121077 master-0 kubenswrapper[13046]: I0308 03:15:49.121039 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:49.121713 master-0 kubenswrapper[13046]: I0308 03:15:49.121671 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:15:49.122073 master-0 kubenswrapper[13046]: E0308 03:15:49.122015 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager 
pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:15:50.062047 master-0 kubenswrapper[13046]: I0308 03:15:50.061962 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:50.854397 master-0 kubenswrapper[13046]: E0308 03:15:50.854257 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:51.062520 master-0 kubenswrapper[13046]: I0308 03:15:51.062340 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:51.551648 master-0 kubenswrapper[13046]: I0308 03:15:51.551563 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:51.555119 master-0 kubenswrapper[13046]: I0308 03:15:51.555069 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:51.555204 master-0 kubenswrapper[13046]: I0308 03:15:51.555129 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:51.555204 master-0 kubenswrapper[13046]: I0308 03:15:51.555147 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:51.555204 master-0 
kubenswrapper[13046]: I0308 03:15:51.555176 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:51.556305 master-0 kubenswrapper[13046]: E0308 03:15:51.556247 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:15:52.062801 master-0 kubenswrapper[13046]: I0308 03:15:52.062670 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:53.062202 master-0 kubenswrapper[13046]: I0308 03:15:53.062100 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:53.588960 master-0 kubenswrapper[13046]: E0308 03:15:53.588754 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:15:54.061932 master-0 kubenswrapper[13046]: I0308 03:15:54.061725 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:55.062219 master-0 kubenswrapper[13046]: I0308 03:15:55.062130 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:56.062550 master-0 kubenswrapper[13046]: I0308 03:15:56.062392 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:57.062037 master-0 kubenswrapper[13046]: I0308 03:15:57.061960 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:57.855799 master-0 kubenswrapper[13046]: E0308 03:15:57.855719 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:15:57.942255 master-0 kubenswrapper[13046]: W0308 03:15:57.942130 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:57.942255 master-0 kubenswrapper[13046]: E0308 03:15:57.942243 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:15:58.062787 master-0 kubenswrapper[13046]: I0308 03:15:58.062668 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:58.556814 master-0 kubenswrapper[13046]: I0308 03:15:58.556731 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:15:58.560458 master-0 kubenswrapper[13046]: I0308 03:15:58.560373 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:15:58.560690 master-0 kubenswrapper[13046]: I0308 03:15:58.560464 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:15:58.560690 master-0 kubenswrapper[13046]: I0308 03:15:58.560540 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:15:58.560690 master-0 kubenswrapper[13046]: I0308 03:15:58.560581 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:15:58.561760 master-0 kubenswrapper[13046]: E0308 03:15:58.561685 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:15:58.727374 master-0 kubenswrapper[13046]: E0308 03:15:58.727323 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:15:58.881089 master-0 kubenswrapper[13046]: W0308 03:15:58.880980 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:15:58.881893 master-0 kubenswrapper[13046]: E0308 03:15:58.881107 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:15:59.062820 master-0 kubenswrapper[13046]: I0308 03:15:59.062717 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:00.062407 master-0 kubenswrapper[13046]: I0308 03:16:00.062315 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:01.062730 master-0 kubenswrapper[13046]: I0308 03:16:01.062602 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:02.061603 master-0 kubenswrapper[13046]: I0308 03:16:02.061476 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:03.062285 master-0 kubenswrapper[13046]: I0308 03:16:03.062209 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:03.117628 master-0 kubenswrapper[13046]: I0308 03:16:03.117589 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:16:03.121087 master-0 kubenswrapper[13046]: I0308 03:16:03.121054 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:16:03.121296 master-0 kubenswrapper[13046]: I0308 03:16:03.121273 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:16:03.121452 master-0 kubenswrapper[13046]: I0308 03:16:03.121432 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:16:03.122093 master-0 kubenswrapper[13046]: I0308 03:16:03.122066 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675" Mar 08 03:16:03.122621 master-0 kubenswrapper[13046]: E0308 03:16:03.122581 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s 
restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:16:03.590335 master-0 kubenswrapper[13046]: E0308 03:16:03.590134 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:16:04.062422 master-0 kubenswrapper[13046]: I0308 03:16:04.062267 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:04.857865 master-0 kubenswrapper[13046]: E0308 03:16:04.857775 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:16:05.062212 master-0 kubenswrapper[13046]: I0308 03:16:05.062116 13046 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:05.562383 master-0 kubenswrapper[13046]: I0308 03:16:05.562271 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:16:05.565076 master-0 kubenswrapper[13046]: I0308 03:16:05.565014 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:16:05.565201 master-0 kubenswrapper[13046]: I0308 03:16:05.565092 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:16:05.565201 master-0 kubenswrapper[13046]: I0308 03:16:05.565115 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:16:05.565201 master-0 kubenswrapper[13046]: I0308 03:16:05.565153 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:16:05.566438 master-0 kubenswrapper[13046]: E0308 03:16:05.566378 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:16:06.062669 master-0 kubenswrapper[13046]: I0308 03:16:06.062579 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:07.061713 master-0 kubenswrapper[13046]: I0308 03:16:07.061670 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:07.118005 master-0 kubenswrapper[13046]: I0308 03:16:07.117943 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:16:07.122055 master-0 kubenswrapper[13046]: I0308 03:16:07.122016 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:16:07.122055 master-0 kubenswrapper[13046]: I0308 03:16:07.122051 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:16:07.122055 master-0 kubenswrapper[13046]: I0308 03:16:07.122061 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:16:08.062107 master-0 kubenswrapper[13046]: I0308 03:16:08.062013 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:08.727585 master-0 kubenswrapper[13046]: E0308 03:16:08.727464 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:16:09.062362 master-0 kubenswrapper[13046]: I0308 03:16:09.062130 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:10.061898 master-0 kubenswrapper[13046]: I0308 03:16:10.061798 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:11.062222 master-0 kubenswrapper[13046]: I0308 03:16:11.062110 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:11.859436 master-0 kubenswrapper[13046]: E0308 03:16:11.859328 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:16:12.062833 master-0 kubenswrapper[13046]: I0308 03:16:12.062734 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:12.567667 master-0 kubenswrapper[13046]: I0308 03:16:12.567567 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:16:12.571096 master-0 kubenswrapper[13046]: I0308 03:16:12.570999 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:16:12.571096 master-0 kubenswrapper[13046]: I0308 03:16:12.571055 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:16:12.571096 master-0 kubenswrapper[13046]: I0308 03:16:12.571068 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:16:12.571096 master-0 kubenswrapper[13046]: I0308 03:16:12.571096 
13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:16:12.572401 master-0 kubenswrapper[13046]: E0308 03:16:12.572314 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:16:12.976702 master-0 kubenswrapper[13046]: W0308 03:16:12.976562 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:12.976702 master-0 kubenswrapper[13046]: E0308 03:16:12.976685 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:16:13.062713 master-0 kubenswrapper[13046]: I0308 03:16:13.062606 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:13.592337 master-0 kubenswrapper[13046]: E0308 03:16:13.592098 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:16:14.063363 master-0 kubenswrapper[13046]: I0308 03:16:14.063140 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:15.062410 master-0 kubenswrapper[13046]: I0308 03:16:15.062276 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:15.118463 master-0 kubenswrapper[13046]: I0308 03:16:15.118345 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:16:15.122676 master-0 kubenswrapper[13046]: I0308 03:16:15.122605 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:16:15.122846 master-0 kubenswrapper[13046]: I0308 03:16:15.122689 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:16:15.122846 master-0 kubenswrapper[13046]: I0308 03:16:15.122746 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:16:15.123452 master-0 
kubenswrapper[13046]: I0308 03:16:15.123388 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675"
Mar 08 03:16:15.123936 master-0 kubenswrapper[13046]: E0308 03:16:15.123881 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:16:16.062675 master-0 kubenswrapper[13046]: I0308 03:16:16.062585 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:17.062018 master-0 kubenswrapper[13046]: I0308 03:16:17.061939 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:18.062367 master-0 kubenswrapper[13046]: I0308 03:16:18.062265 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:18.711367 master-0 kubenswrapper[13046]: E0308 03:16:18.711279 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff: no such file or directory, extraDiskErr:
Mar 08 03:16:18.728659 master-0 kubenswrapper[13046]: E0308 03:16:18.728596 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:16:18.861715 master-0 kubenswrapper[13046]: E0308 03:16:18.861592 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 08 03:16:19.062877 master-0 kubenswrapper[13046]: I0308 03:16:19.062678 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:19.573055 master-0 kubenswrapper[13046]: I0308 03:16:19.572943 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:19.577141 master-0 kubenswrapper[13046]: I0308 03:16:19.577101 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:19.577141 master-0 kubenswrapper[13046]: I0308 03:16:19.577145 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:19.577340 master-0 kubenswrapper[13046]: I0308 03:16:19.577161 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:19.577340 master-0 kubenswrapper[13046]: I0308 03:16:19.577186 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:16:19.578246 master-0 kubenswrapper[13046]: E0308 03:16:19.578165 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:16:20.062751 master-0 kubenswrapper[13046]: I0308 03:16:20.062637 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:21.062158 master-0 kubenswrapper[13046]: I0308 03:16:21.061942 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:22.062066 master-0 kubenswrapper[13046]: I0308 03:16:22.061903 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:23.062222 master-0 kubenswrapper[13046]: I0308 03:16:23.062119 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:23.593763 master-0 kubenswrapper[13046]: E0308 03:16:23.593564 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:16:24.061716 master-0 kubenswrapper[13046]: I0308 03:16:24.061515 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:25.063236 master-0 kubenswrapper[13046]: I0308 03:16:25.063122 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:25.863678 master-0 kubenswrapper[13046]: E0308 03:16:25.863551 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 08 03:16:26.062545 master-0 kubenswrapper[13046]: I0308 03:16:26.062403 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:26.579187 master-0 kubenswrapper[13046]: I0308 03:16:26.579063 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:26.583521 master-0 kubenswrapper[13046]: I0308 03:16:26.583445 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:26.583660 master-0 kubenswrapper[13046]: I0308 03:16:26.583543 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:26.583660 master-0 kubenswrapper[13046]: I0308 03:16:26.583562 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:26.583660 master-0 kubenswrapper[13046]: I0308 03:16:26.583593 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:16:26.584729 master-0 kubenswrapper[13046]: E0308 03:16:26.584656 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:16:27.062024 master-0 kubenswrapper[13046]: I0308 03:16:27.061921 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:28.062133 master-0 kubenswrapper[13046]: I0308 03:16:28.062076 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:28.729823 master-0 kubenswrapper[13046]: E0308 03:16:28.729721 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:16:29.062328 master-0 kubenswrapper[13046]: I0308 03:16:29.062163 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:29.119322 master-0 kubenswrapper[13046]: I0308 03:16:29.118462 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:29.122470 master-0 kubenswrapper[13046]: I0308 03:16:29.122380 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:29.122470 master-0 kubenswrapper[13046]: I0308 03:16:29.122458 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:29.122870 master-0 kubenswrapper[13046]: I0308 03:16:29.122507 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:29.123149 master-0 kubenswrapper[13046]: I0308 03:16:29.123109 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675"
Mar 08 03:16:29.882308 master-0 kubenswrapper[13046]: I0308 03:16:29.882243 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"}
Mar 08 03:16:29.882646 master-0 kubenswrapper[13046]: I0308 03:16:29.882421 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:29.885716 master-0 kubenswrapper[13046]: I0308 03:16:29.885674 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:29.885716 master-0 kubenswrapper[13046]: I0308 03:16:29.885720 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:29.885914 master-0 kubenswrapper[13046]: I0308 03:16:29.885735 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:30.061896 master-0 kubenswrapper[13046]: I0308 03:16:30.061829 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:30.063775 master-0 kubenswrapper[13046]: W0308 03:16:30.063695 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:30.064358 master-0 kubenswrapper[13046]: E0308 03:16:30.063779 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:16:30.490962 master-0 kubenswrapper[13046]: I0308 03:16:30.490854 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:16:30.892878 master-0 kubenswrapper[13046]: I0308 03:16:30.892804 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" exitCode=1
Mar 08 03:16:30.893153 master-0 kubenswrapper[13046]: I0308 03:16:30.892881 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"}
Mar 08 03:16:30.893153 master-0 kubenswrapper[13046]: I0308 03:16:30.892935 13046 scope.go:117] "RemoveContainer" containerID="686b46be8fc479f1c1eeb0616c06434b976527616220b3c5dae312927e3a3675"
Mar 08 03:16:30.893153 master-0 kubenswrapper[13046]: I0308 03:16:30.892902 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:30.896432 master-0 kubenswrapper[13046]: I0308 03:16:30.896391 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:30.896739 master-0 kubenswrapper[13046]: I0308 03:16:30.896712 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:30.896978 master-0 kubenswrapper[13046]: I0308 03:16:30.896948 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:30.897818 master-0 kubenswrapper[13046]: I0308 03:16:30.897786 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:16:30.898470 master-0 kubenswrapper[13046]: E0308 03:16:30.898416 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:16:31.062593 master-0 kubenswrapper[13046]: I0308 03:16:31.062476 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:31.901735 master-0 kubenswrapper[13046]: I0308 03:16:31.901675 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:31.905106 master-0 kubenswrapper[13046]: I0308 03:16:31.905055 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:31.905233 master-0 kubenswrapper[13046]: I0308 03:16:31.905206 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:31.905291 master-0 kubenswrapper[13046]: I0308 03:16:31.905276 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:31.905926 master-0 kubenswrapper[13046]: I0308 03:16:31.905890 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:16:31.913062 master-0 kubenswrapper[13046]: E0308 03:16:31.913006 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:16:32.062680 master-0 kubenswrapper[13046]: I0308 03:16:32.062542 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:32.865142 master-0 kubenswrapper[13046]: E0308 03:16:32.865024 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 08 03:16:33.062985 master-0 kubenswrapper[13046]: I0308 03:16:33.062882 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:33.585744 master-0 kubenswrapper[13046]: I0308 03:16:33.585653 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:33.589315 master-0 kubenswrapper[13046]: I0308 03:16:33.589260 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:33.589430 master-0 kubenswrapper[13046]: I0308 03:16:33.589327 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:33.589430 master-0 kubenswrapper[13046]: I0308 03:16:33.589346 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:33.589430 master-0 kubenswrapper[13046]: I0308 03:16:33.589379 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:16:33.590445 master-0 kubenswrapper[13046]: E0308 03:16:33.590375 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:16:33.595597 master-0 kubenswrapper[13046]: E0308 03:16:33.595394 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:16:34.061523 master-0 kubenswrapper[13046]: I0308 03:16:34.061432 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:35.062577 master-0 kubenswrapper[13046]: I0308 03:16:35.062450 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:35.509245 master-0 kubenswrapper[13046]: I0308 03:16:35.509189 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:16:35.509751 master-0 kubenswrapper[13046]: I0308 03:16:35.509724 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:35.513319 master-0 kubenswrapper[13046]: I0308 03:16:35.513252 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:35.513449 master-0 kubenswrapper[13046]: I0308 03:16:35.513345 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:35.513449 master-0 kubenswrapper[13046]: I0308 03:16:35.513379 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:35.514202 master-0 kubenswrapper[13046]: I0308 03:16:35.514152 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:16:35.514627 master-0 kubenswrapper[13046]: E0308 03:16:35.514574 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:16:36.062142 master-0 kubenswrapper[13046]: I0308 03:16:36.062086 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:37.062189 master-0 kubenswrapper[13046]: I0308 03:16:37.062092 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:37.923616 master-0 kubenswrapper[13046]: W0308 03:16:37.923450 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:37.923616 master-0 kubenswrapper[13046]: E0308 03:16:37.923594 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:16:38.061475 master-0 kubenswrapper[13046]: I0308 03:16:38.061358 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:38.730224 master-0 kubenswrapper[13046]: E0308 03:16:38.730150 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:16:39.062053 master-0 kubenswrapper[13046]: I0308 03:16:39.061824 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:39.177923 master-0 kubenswrapper[13046]: I0308 03:16:39.177750 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:16:39.177923 master-0 kubenswrapper[13046]: I0308 03:16:39.177922 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 08 03:16:39.178298 master-0 kubenswrapper[13046]: I0308 03:16:39.178075 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:39.188978 master-0 kubenswrapper[13046]: I0308 03:16:39.188907 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:39.189149 master-0 kubenswrapper[13046]: I0308 03:16:39.188982 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:39.189149 master-0 kubenswrapper[13046]: I0308 03:16:39.189008 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:39.189655 master-0 kubenswrapper[13046]: I0308 03:16:39.189615 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:16:39.190191 master-0 kubenswrapper[13046]: E0308 03:16:39.190138 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:16:39.867191 master-0 kubenswrapper[13046]: E0308 03:16:39.867085 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 08 03:16:40.062976 master-0 kubenswrapper[13046]: I0308 03:16:40.062851 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:40.591659 master-0 kubenswrapper[13046]: I0308 03:16:40.591563 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:40.594079 master-0 kubenswrapper[13046]: I0308 03:16:40.594030 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:40.594079 master-0 kubenswrapper[13046]: I0308 03:16:40.594073 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:40.594079 master-0 kubenswrapper[13046]: I0308 03:16:40.594084 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:40.594321 master-0 kubenswrapper[13046]: I0308 03:16:40.594133 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:16:40.595845 master-0 kubenswrapper[13046]: E0308 03:16:40.595765 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:16:41.063064 master-0 kubenswrapper[13046]: I0308 03:16:41.062834 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:41.118010 master-0 kubenswrapper[13046]: I0308 03:16:41.117923 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:41.121354 master-0 kubenswrapper[13046]: I0308 03:16:41.121291 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:41.121478 master-0 kubenswrapper[13046]: I0308 03:16:41.121362 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:41.121478 master-0 kubenswrapper[13046]: I0308 03:16:41.121381 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:42.062674 master-0 kubenswrapper[13046]: I0308 03:16:42.062573 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:43.062612 master-0 kubenswrapper[13046]: I0308 03:16:43.062445 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:43.597877 master-0 kubenswrapper[13046]: E0308 03:16:43.597670 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:16:44.062632 master-0 kubenswrapper[13046]: I0308 03:16:44.062412 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:45.062300 master-0 kubenswrapper[13046]: I0308 03:16:45.062189 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:46.062268 master-0 kubenswrapper[13046]: I0308 03:16:46.062147 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:46.868952 master-0 kubenswrapper[13046]: E0308 03:16:46.868845 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 08 03:16:47.062390 master-0 kubenswrapper[13046]: I0308 03:16:47.062252 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:47.596349 master-0 kubenswrapper[13046]: I0308 03:16:47.596248 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:47.607467 master-0 kubenswrapper[13046]: I0308 03:16:47.607380 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:47.607467 master-0 kubenswrapper[13046]: I0308 03:16:47.607458 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:47.607834 master-0 kubenswrapper[13046]: I0308 03:16:47.607547 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:47.607834 master-0 kubenswrapper[13046]: I0308 03:16:47.607588 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 08 03:16:47.609037 master-0 kubenswrapper[13046]: E0308 03:16:47.608955 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 08 03:16:48.062562 master-0 kubenswrapper[13046]: I0308 03:16:48.062337 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:48.731234 master-0 kubenswrapper[13046]: E0308 03:16:48.731161 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 08 03:16:49.063063 master-0 kubenswrapper[13046]: I0308 03:16:49.062747 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:50.062658 master-0 kubenswrapper[13046]: I0308 03:16:50.062546 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:51.008342 master-0 kubenswrapper[13046]: W0308 03:16:51.008199 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:51.009180 master-0 kubenswrapper[13046]: E0308 03:16:51.008344 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:16:51.062297 master-0 kubenswrapper[13046]: I0308 03:16:51.062185 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:52.062338 master-0 kubenswrapper[13046]: I0308 03:16:52.062235 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:52.118351 master-0 kubenswrapper[13046]: I0308 03:16:52.118238 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 03:16:52.121655 master-0 kubenswrapper[13046]: I0308 03:16:52.121592 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 08 03:16:52.121767 master-0 kubenswrapper[13046]: I0308 03:16:52.121669 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 08 03:16:52.121767 master-0 kubenswrapper[13046]: I0308 03:16:52.121690 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 08 03:16:52.122255 master-0 kubenswrapper[13046]: I0308 03:16:52.122205 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:16:52.122666 master-0 kubenswrapper[13046]: E0308 03:16:52.122612 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:16:52.665535 master-0 kubenswrapper[13046]: W0308 03:16:52.665374 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:52.665837 master-0 kubenswrapper[13046]: E0308 03:16:52.665553 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 08 03:16:53.062291 master-0 kubenswrapper[13046]: I0308 03:16:53.062077 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 08 03:16:53.599780 master-0 kubenswrapper[13046]: E0308 03:16:53.599571 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:16:53.600691 master-0 kubenswrapper[13046]: E0308 03:16:53.599736 13046 event.go:307] "Unable to write event (retry limit exceeded!)" event="&Event{ObjectMeta:{master-0.189abf2beb9e6d25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,LastTimestamp:2026-03-08 03:13:18.681238821 +0000 UTC m=+0.760006048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:16:53.600691 master-0 kubenswrapper[13046]: E0308
03:16:53.600318 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9ee74c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,LastTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:16:53.870988 master-0 kubenswrapper[13046]: E0308 03:16:53.870777 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:16:54.062162 master-0 kubenswrapper[13046]: I0308 03:16:54.062063 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:54.610032 master-0 kubenswrapper[13046]: I0308 03:16:54.609933 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:16:54.613429 master-0 kubenswrapper[13046]: I0308 03:16:54.613336 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:16:54.613429 master-0 
kubenswrapper[13046]: I0308 03:16:54.613421 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:16:54.613671 master-0 kubenswrapper[13046]: I0308 03:16:54.613446 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:16:54.613671 master-0 kubenswrapper[13046]: I0308 03:16:54.613526 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:16:54.614599 master-0 kubenswrapper[13046]: E0308 03:16:54.614473 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:16:55.062534 master-0 kubenswrapper[13046]: I0308 03:16:55.062358 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:56.062525 master-0 kubenswrapper[13046]: I0308 03:16:56.062387 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:57.062130 master-0 kubenswrapper[13046]: I0308 03:16:57.062014 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:57.118395 master-0 kubenswrapper[13046]: I0308 03:16:57.118295 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:16:57.121961 master-0 
kubenswrapper[13046]: I0308 03:16:57.121885 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:16:57.121961 master-0 kubenswrapper[13046]: I0308 03:16:57.121959 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:16:57.122169 master-0 kubenswrapper[13046]: I0308 03:16:57.121982 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:16:58.062822 master-0 kubenswrapper[13046]: I0308 03:16:58.062742 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:16:58.732200 master-0 kubenswrapper[13046]: E0308 03:16:58.732115 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:16:59.062599 master-0 kubenswrapper[13046]: I0308 03:16:59.062377 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:00.061997 master-0 kubenswrapper[13046]: I0308 03:17:00.061897 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:00.885321 master-0 kubenswrapper[13046]: E0308 03:17:00.885159 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:17:01.062374 master-0 kubenswrapper[13046]: I0308 03:17:01.062252 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:01.615700 master-0 kubenswrapper[13046]: I0308 03:17:01.615570 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:01.619194 master-0 kubenswrapper[13046]: I0308 03:17:01.619114 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:01.619194 master-0 kubenswrapper[13046]: I0308 03:17:01.619178 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:01.619194 master-0 kubenswrapper[13046]: I0308 03:17:01.619195 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:01.619516 master-0 kubenswrapper[13046]: I0308 03:17:01.619226 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:17:01.620747 master-0 kubenswrapper[13046]: E0308 03:17:01.620643 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:17:02.062651 master-0 kubenswrapper[13046]: I0308 03:17:02.062418 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": 
dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:02.204059 master-0 kubenswrapper[13046]: E0308 03:17:02.203896 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9ee74c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,LastTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:17:03.062153 master-0 kubenswrapper[13046]: I0308 03:17:03.062052 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:03.117699 master-0 kubenswrapper[13046]: I0308 03:17:03.117645 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:03.120900 master-0 kubenswrapper[13046]: I0308 03:17:03.120838 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:03.121050 master-0 kubenswrapper[13046]: I0308 03:17:03.120911 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:03.121050 master-0 kubenswrapper[13046]: I0308 03:17:03.120935 13046 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:03.121595 master-0 kubenswrapper[13046]: I0308 03:17:03.121554 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:03.121936 master-0 kubenswrapper[13046]: E0308 03:17:03.121884 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:04.062917 master-0 kubenswrapper[13046]: I0308 03:17:04.062851 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:04.294248 master-0 kubenswrapper[13046]: W0308 03:17:04.294111 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:04.294248 master-0 kubenswrapper[13046]: E0308 03:17:04.294228 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:17:05.062675 master-0 kubenswrapper[13046]: I0308 03:17:05.062579 13046 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:06.062090 master-0 kubenswrapper[13046]: I0308 03:17:06.061984 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:07.061054 master-0 kubenswrapper[13046]: I0308 03:17:07.060980 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:07.886618 master-0 kubenswrapper[13046]: E0308 03:17:07.886469 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:17:08.061843 master-0 kubenswrapper[13046]: I0308 03:17:08.061726 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:08.621526 master-0 kubenswrapper[13046]: I0308 03:17:08.621402 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:08.624905 master-0 kubenswrapper[13046]: I0308 03:17:08.624852 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:08.625023 master-0 kubenswrapper[13046]: I0308 
03:17:08.624910 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:08.625023 master-0 kubenswrapper[13046]: I0308 03:17:08.624930 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:08.625023 master-0 kubenswrapper[13046]: I0308 03:17:08.624961 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:17:08.626261 master-0 kubenswrapper[13046]: E0308 03:17:08.626181 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:17:08.733080 master-0 kubenswrapper[13046]: E0308 03:17:08.732971 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:17:09.062836 master-0 kubenswrapper[13046]: I0308 03:17:09.062641 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:10.062312 master-0 kubenswrapper[13046]: I0308 03:17:10.062204 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:11.062447 master-0 kubenswrapper[13046]: I0308 03:17:11.062345 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:12.062082 
master-0 kubenswrapper[13046]: I0308 03:17:12.061980 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:12.205963 master-0 kubenswrapper[13046]: E0308 03:17:12.205732 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9ee74c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,LastTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:17:13.063139 master-0 kubenswrapper[13046]: I0308 03:17:13.062908 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:13.157752 master-0 kubenswrapper[13046]: I0308 03:17:13.157621 13046 generic.go:334] "Generic (PLEG): container finished" podID="e74c8bb2-e063-4b60-b3fe-651aa534d029" containerID="5cb8f3acbb7aa9ec545c1b8e4b064d16cbafd48b223783d78db54ee94e2fb56a" exitCode=0 Mar 08 03:17:14.061861 master-0 kubenswrapper[13046]: I0308 03:17:14.061754 13046 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:14.888020 master-0 kubenswrapper[13046]: E0308 03:17:14.887896 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:17:15.063192 master-0 kubenswrapper[13046]: I0308 03:17:15.063076 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:15.117831 master-0 kubenswrapper[13046]: I0308 03:17:15.117743 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:15.121230 master-0 kubenswrapper[13046]: I0308 03:17:15.121173 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:15.121521 master-0 kubenswrapper[13046]: I0308 03:17:15.121251 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:15.121521 master-0 kubenswrapper[13046]: I0308 03:17:15.121275 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:15.121871 master-0 kubenswrapper[13046]: I0308 03:17:15.121821 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:15.122228 master-0 kubenswrapper[13046]: E0308 03:17:15.122168 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:15.627070 master-0 kubenswrapper[13046]: I0308 03:17:15.626997 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:15.630536 master-0 kubenswrapper[13046]: I0308 03:17:15.630460 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:15.630684 master-0 kubenswrapper[13046]: I0308 03:17:15.630563 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:15.630684 master-0 kubenswrapper[13046]: I0308 03:17:15.630583 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:15.630684 master-0 kubenswrapper[13046]: I0308 03:17:15.630617 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:17:15.631718 master-0 kubenswrapper[13046]: E0308 03:17:15.631654 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:17:16.062470 master-0 kubenswrapper[13046]: I0308 03:17:16.062314 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:17.062771 master-0 kubenswrapper[13046]: I0308 03:17:17.062687 13046 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:18.061808 master-0 kubenswrapper[13046]: I0308 03:17:18.061670 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:18.733652 master-0 kubenswrapper[13046]: E0308 03:17:18.733572 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:17:19.063145 master-0 kubenswrapper[13046]: I0308 03:17:19.062918 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:19.143692 master-0 kubenswrapper[13046]: E0308 03:17:19.143585 13046 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Mar 08 03:17:19.144039 master-0 kubenswrapper[13046]: E0308 03:17:19.143717 13046 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:19.144039 master-0 kubenswrapper[13046]: E0308 03:17:19.143747 13046 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:19.144039 master-0 kubenswrapper[13046]: E0308 03:17:19.143808 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"kube-apiserver-master-0_openshift-kube-apiserver(cdcecc61ff5eeb08bd2a3ac12599e4f9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"kube-apiserver-master-0_openshift-kube-apiserver(cdcecc61ff5eeb08bd2a3ac12599e4f9)\\\": rpc error: code = DeadlineExceeded desc = context deadline exceeded\"" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" Mar 08 03:17:19.146125 master-0 kubenswrapper[13046]: E0308 03:17:19.146084 13046 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Mar 08 03:17:19.146125 master-0 kubenswrapper[13046]: E0308 03:17:19.146131 13046 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:17:19.146352 master-0 kubenswrapper[13046]: E0308 03:17:19.146152 13046 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:17:19.146352 master-0 kubenswrapper[13046]: E0308 03:17:19.146204 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"kube-apiserver-startup-monitor-master-0_openshift-kube-apiserver(f417e14665db2ffffa887ce21c9ff0ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"kube-apiserver-startup-monitor-master-0_openshift-kube-apiserver(f417e14665db2ffffa887ce21c9ff0ed)\\\": rpc error: code = DeadlineExceeded desc = context deadline exceeded\"" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f417e14665db2ffffa887ce21c9ff0ed" Mar 08 03:17:20.063034 master-0 kubenswrapper[13046]: I0308 03:17:20.062801 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:21.062303 master-0 kubenswrapper[13046]: I0308 03:17:21.062194 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:21.889348 master-0 kubenswrapper[13046]: E0308 03:17:21.889226 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:17:22.062626 master-0 kubenswrapper[13046]: I0308 03:17:22.062470 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:22.208252 master-0 kubenswrapper[13046]: E0308 03:17:22.207959 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9ee74c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,LastTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC 
m=+0.760037309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:17:22.632693 master-0 kubenswrapper[13046]: I0308 03:17:22.632578 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:22.636584 master-0 kubenswrapper[13046]: I0308 03:17:22.636458 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:22.636584 master-0 kubenswrapper[13046]: I0308 03:17:22.636571 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:22.636584 master-0 kubenswrapper[13046]: I0308 03:17:22.636592 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:22.636972 master-0 kubenswrapper[13046]: I0308 03:17:22.636627 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:17:22.637782 master-0 kubenswrapper[13046]: E0308 03:17:22.637701 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:17:23.062542 master-0 kubenswrapper[13046]: I0308 03:17:23.062392 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:23.117997 master-0 kubenswrapper[13046]: I0308 03:17:23.117898 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:23.121392 master-0 kubenswrapper[13046]: I0308 03:17:23.121307 13046 kubelet_node_status.go:724] "Recording event 
message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:23.121392 master-0 kubenswrapper[13046]: I0308 03:17:23.121370 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:23.121392 master-0 kubenswrapper[13046]: I0308 03:17:23.121385 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:24.062746 master-0 kubenswrapper[13046]: I0308 03:17:24.062618 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:25.062572 master-0 kubenswrapper[13046]: I0308 03:17:25.062465 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:25.242813 master-0 kubenswrapper[13046]: E0308 03:17:25.242710 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/6883c4d457e8e6721bc051f311e603087d24fd4c1e513d035cfd6a59f2c949b5/diff" to get inode usage: stat /var/lib/containers/storage/overlay/6883c4d457e8e6721bc051f311e603087d24fd4c1e513d035cfd6a59f2c949b5/diff: no such file or directory, extraDiskErr: Mar 08 03:17:25.269167 master-0 kubenswrapper[13046]: E0308 03:17:25.269096 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/97ce9afd1a3738e92755e26167d458e3d4b7794d7f92e1450094a6a0c26a5216/diff" to get inode usage: stat /var/lib/containers/storage/overlay/97ce9afd1a3738e92755e26167d458e3d4b7794d7f92e1450094a6a0c26a5216/diff: no such file or directory, extraDiskErr: 
Mar 08 03:17:26.062236 master-0 kubenswrapper[13046]: I0308 03:17:26.062156 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:26.117819 master-0 kubenswrapper[13046]: I0308 03:17:26.117760 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:26.121313 master-0 kubenswrapper[13046]: I0308 03:17:26.121241 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:26.121313 master-0 kubenswrapper[13046]: I0308 03:17:26.121299 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:26.121621 master-0 kubenswrapper[13046]: I0308 03:17:26.121322 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:26.121917 master-0 kubenswrapper[13046]: I0308 03:17:26.121870 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:26.122227 master-0 kubenswrapper[13046]: E0308 03:17:26.122191 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:27.061915 master-0 kubenswrapper[13046]: I0308 03:17:27.061825 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:28.061981 master-0 kubenswrapper[13046]: I0308 03:17:28.061898 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:28.734315 master-0 kubenswrapper[13046]: E0308 03:17:28.734225 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:17:28.891297 master-0 kubenswrapper[13046]: E0308 03:17:28.891231 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:17:29.062041 master-0 kubenswrapper[13046]: I0308 03:17:29.061872 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:29.415555 master-0 kubenswrapper[13046]: W0308 03:17:29.415398 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:29.415821 master-0 kubenswrapper[13046]: E0308 03:17:29.415565 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:17:29.638855 master-0 kubenswrapper[13046]: I0308 03:17:29.638701 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:29.641990 master-0 kubenswrapper[13046]: I0308 03:17:29.641918 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:29.642121 master-0 kubenswrapper[13046]: I0308 03:17:29.642005 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:29.642121 master-0 kubenswrapper[13046]: I0308 03:17:29.642023 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:29.642121 master-0 kubenswrapper[13046]: I0308 03:17:29.642072 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:17:29.643239 master-0 kubenswrapper[13046]: E0308 03:17:29.643160 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 08 03:17:30.062891 master-0 kubenswrapper[13046]: I0308 03:17:30.062776 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:30.270140 master-0 kubenswrapper[13046]: I0308 03:17:30.269990 13046 generic.go:334] "Generic (PLEG): container finished" podID="af653e87-ce5f-4f1a-a20d-233c563694ba" containerID="6c59d77b77a1f89b306ddf4cc0f2bd1da0d815a10de107029f05b136ace17ea9" 
exitCode=0 Mar 08 03:17:31.062974 master-0 kubenswrapper[13046]: I0308 03:17:31.062864 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:31.118309 master-0 kubenswrapper[13046]: I0308 03:17:31.118224 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:17:31.118309 master-0 kubenswrapper[13046]: I0308 03:17:31.118310 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:31.122085 master-0 kubenswrapper[13046]: I0308 03:17:31.121994 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:31.122210 master-0 kubenswrapper[13046]: I0308 03:17:31.122117 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:31.122210 master-0 kubenswrapper[13046]: I0308 03:17:31.122138 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:31.122762 master-0 kubenswrapper[13046]: I0308 03:17:31.122683 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:17:31.148056 master-0 kubenswrapper[13046]: W0308 03:17:31.147980 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf417e14665db2ffffa887ce21c9ff0ed.slice/crio-e79b5be5dfd00c6ca73552c6e91b58842748f1a0b0425868e36c91580928c831 WatchSource:0}: Error finding container e79b5be5dfd00c6ca73552c6e91b58842748f1a0b0425868e36c91580928c831: Status 404 returned error can't find the container with id e79b5be5dfd00c6ca73552c6e91b58842748f1a0b0425868e36c91580928c831 Mar 08 03:17:31.278085 master-0 kubenswrapper[13046]: I0308 03:17:31.277998 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"e79b5be5dfd00c6ca73552c6e91b58842748f1a0b0425868e36c91580928c831"} Mar 08 03:17:32.061825 master-0 kubenswrapper[13046]: I0308 03:17:32.061757 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:32.210559 master-0 kubenswrapper[13046]: E0308 03:17:32.210338 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189abf2beb9ee74c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,LastTimestamp:2026-03-08 03:13:18.681270092 +0000 UTC m=+0.760037309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:17:32.287079 master-0 kubenswrapper[13046]: I0308 03:17:32.286962 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0"} Mar 08 03:17:32.287079 master-0 kubenswrapper[13046]: I0308 03:17:32.287044 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:32.290221 master-0 kubenswrapper[13046]: I0308 03:17:32.290182 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:32.290221 master-0 kubenswrapper[13046]: I0308 03:17:32.290214 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:32.290221 master-0 kubenswrapper[13046]: I0308 03:17:32.290224 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:33.062587 master-0 kubenswrapper[13046]: I0308 03:17:33.062431 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:33.294040 master-0 kubenswrapper[13046]: I0308 03:17:33.293943 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:33.297131 
master-0 kubenswrapper[13046]: I0308 03:17:33.297064 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:33.297131 master-0 kubenswrapper[13046]: I0308 03:17:33.297124 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:33.297378 master-0 kubenswrapper[13046]: I0308 03:17:33.297142 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:34.062095 master-0 kubenswrapper[13046]: I0308 03:17:34.062004 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:34.117734 master-0 kubenswrapper[13046]: I0308 03:17:34.117640 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:34.117734 master-0 kubenswrapper[13046]: I0308 03:17:34.117728 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:34.121208 master-0 kubenswrapper[13046]: I0308 03:17:34.121144 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:34.121208 master-0 kubenswrapper[13046]: I0308 03:17:34.121202 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:34.121208 master-0 kubenswrapper[13046]: I0308 03:17:34.121216 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:34.121838 master-0 kubenswrapper[13046]: I0308 03:17:34.121798 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:34.134851 master-0 kubenswrapper[13046]: W0308 03:17:34.134717 13046 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:34.134851 master-0 kubenswrapper[13046]: E0308 03:17:34.134801 13046 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 08 03:17:34.146211 master-0 kubenswrapper[13046]: W0308 03:17:34.146136 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-82347536b51849df89c980fbc8cd08e08fc41874e74a041a3ec97fd0e04bcf92 WatchSource:0}: Error finding container 82347536b51849df89c980fbc8cd08e08fc41874e74a041a3ec97fd0e04bcf92: Status 404 returned error can't find the container with id 82347536b51849df89c980fbc8cd08e08fc41874e74a041a3ec97fd0e04bcf92 Mar 08 03:17:34.310579 master-0 kubenswrapper[13046]: I0308 03:17:34.310454 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"82347536b51849df89c980fbc8cd08e08fc41874e74a041a3ec97fd0e04bcf92"} Mar 08 03:17:35.062682 master-0 kubenswrapper[13046]: I0308 03:17:35.062437 13046 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial 
tcp 192.168.32.10:6443: connect: connection refused Mar 08 03:17:35.321105 master-0 kubenswrapper[13046]: I0308 03:17:35.320946 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/0.log" Mar 08 03:17:35.321877 master-0 kubenswrapper[13046]: I0308 03:17:35.321604 13046 generic.go:334] "Generic (PLEG): container finished" podID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" containerID="67b6371de1e40f11492bdbedad65b4bb4c5dafeb7f94b97c8372fcadf4c1308d" exitCode=1 Mar 08 03:17:35.324234 master-0 kubenswrapper[13046]: I0308 03:17:35.324185 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="5cf51539374bcfe72a242f1e53596d9c98c86b64c9179b7354efb8ce2765e3ca" exitCode=0 Mar 08 03:17:35.324345 master-0 kubenswrapper[13046]: I0308 03:17:35.324254 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"5cf51539374bcfe72a242f1e53596d9c98c86b64c9179b7354efb8ce2765e3ca"} Mar 08 03:17:35.324607 master-0 kubenswrapper[13046]: I0308 03:17:35.324558 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:35.326311 master-0 kubenswrapper[13046]: I0308 03:17:35.326280 13046 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="5b012a30bc2b3cc713592db2b85b5cc01f37c9b84a3768f1b8abdd21b2236990" exitCode=0 Mar 08 03:17:35.326417 master-0 kubenswrapper[13046]: I0308 03:17:35.326318 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"5b012a30bc2b3cc713592db2b85b5cc01f37c9b84a3768f1b8abdd21b2236990"} Mar 08 
03:17:35.326519 master-0 kubenswrapper[13046]: I0308 03:17:35.326449 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:35.327904 master-0 kubenswrapper[13046]: I0308 03:17:35.327859 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:35.328024 master-0 kubenswrapper[13046]: I0308 03:17:35.327926 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:35.328024 master-0 kubenswrapper[13046]: I0308 03:17:35.327946 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:35.328569 master-0 kubenswrapper[13046]: I0308 03:17:35.328454 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:35.328569 master-0 kubenswrapper[13046]: I0308 03:17:35.328545 13046 scope.go:117] "RemoveContainer" containerID="5cf51539374bcfe72a242f1e53596d9c98c86b64c9179b7354efb8ce2765e3ca" Mar 08 03:17:35.330263 master-0 kubenswrapper[13046]: I0308 03:17:35.330219 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:35.330374 master-0 kubenswrapper[13046]: I0308 03:17:35.330273 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:35.330374 master-0 kubenswrapper[13046]: I0308 03:17:35.330297 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:35.332349 master-0 kubenswrapper[13046]: I0308 03:17:35.332188 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:35.335527 master-0 kubenswrapper[13046]: I0308 03:17:35.335457 13046 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:35.335652 master-0 kubenswrapper[13046]: I0308 03:17:35.335548 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:35.335652 master-0 kubenswrapper[13046]: I0308 03:17:35.335574 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:35.627420 master-0 kubenswrapper[13046]: E0308 03:17:35.627358 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:35.836186 master-0 kubenswrapper[13046]: I0308 03:17:35.836037 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:17:36.347377 master-0 kubenswrapper[13046]: I0308 03:17:36.345996 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"a9e780a9eb2bd513e2633ca7eb901eb0af2ab3ee1ed5af3a95bd9d57edb15b71"} Mar 08 03:17:36.347377 master-0 kubenswrapper[13046]: I0308 03:17:36.346180 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:36.350306 master-0 kubenswrapper[13046]: I0308 03:17:36.350070 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:36.350306 master-0 kubenswrapper[13046]: I0308 03:17:36.350130 13046 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:36.350306 master-0 kubenswrapper[13046]: I0308 03:17:36.350138 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:36.350609 master-0 kubenswrapper[13046]: I0308 03:17:36.350455 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:36.350679 master-0 kubenswrapper[13046]: E0308 03:17:36.350649 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:36.358549 master-0 kubenswrapper[13046]: I0308 03:17:36.357452 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"72517ac9670d34df16c03c6560b187788f7f0baf22e95a4ce45b7d58900f22fc"} Mar 08 03:17:36.358549 master-0 kubenswrapper[13046]: I0308 03:17:36.357521 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"352601263b2ea037568f79eb419fdd95756531630d85b95824eedb3557887aab"} Mar 08 03:17:36.358549 master-0 kubenswrapper[13046]: I0308 03:17:36.357538 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"9b42a4f3cd06b596c0776aff41c17ae083724aac3b4bd87b457ee3501b6408f8"} Mar 08 03:17:36.643789 master-0 
kubenswrapper[13046]: I0308 03:17:36.643722 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:36.648104 master-0 kubenswrapper[13046]: I0308 03:17:36.646368 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:36.648104 master-0 kubenswrapper[13046]: I0308 03:17:36.646412 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:36.648104 master-0 kubenswrapper[13046]: I0308 03:17:36.646421 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:36.648104 master-0 kubenswrapper[13046]: I0308 03:17:36.646441 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:17:37.377506 master-0 kubenswrapper[13046]: I0308 03:17:37.377433 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"ceb0c94c04f56c2553f651c28f375c02ae1b955b20e010419230a4a5aff01519"} Mar 08 03:17:37.377506 master-0 kubenswrapper[13046]: I0308 03:17:37.377468 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:37.377506 master-0 kubenswrapper[13046]: I0308 03:17:37.377502 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:37.378144 master-0 kubenswrapper[13046]: I0308 03:17:37.377503 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"8267a2d84f723ff694dc31049976c14f450972f451107ec8a3714b4067dbd5aa"} Mar 08 03:17:37.380376 master-0 kubenswrapper[13046]: I0308 03:17:37.380349 13046 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:37.380465 master-0 kubenswrapper[13046]: I0308 03:17:37.380380 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:37.380465 master-0 kubenswrapper[13046]: I0308 03:17:37.380352 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:37.380465 master-0 kubenswrapper[13046]: I0308 03:17:37.380416 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:37.380465 master-0 kubenswrapper[13046]: I0308 03:17:37.380427 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:37.380635 master-0 kubenswrapper[13046]: I0308 03:17:37.380389 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:37.380933 master-0 kubenswrapper[13046]: I0308 03:17:37.380898 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:37.381170 master-0 kubenswrapper[13046]: E0308 03:17:37.381129 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:38.387247 master-0 kubenswrapper[13046]: I0308 03:17:38.387188 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:38.387790 master-0 kubenswrapper[13046]: I0308 
03:17:38.387329 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:38.390768 master-0 kubenswrapper[13046]: I0308 03:17:38.390735 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:38.390864 master-0 kubenswrapper[13046]: I0308 03:17:38.390786 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:38.390864 master-0 kubenswrapper[13046]: I0308 03:17:38.390799 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:38.735365 master-0 kubenswrapper[13046]: E0308 03:17:38.735211 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:17:39.219349 master-0 kubenswrapper[13046]: I0308 03:17:39.219263 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:39.392378 master-0 kubenswrapper[13046]: I0308 03:17:39.392315 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:39.395997 master-0 kubenswrapper[13046]: I0308 03:17:39.395949 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:39.396144 master-0 kubenswrapper[13046]: I0308 03:17:39.396038 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:39.396144 master-0 kubenswrapper[13046]: I0308 03:17:39.396065 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:39.630390 master-0 kubenswrapper[13046]: I0308 03:17:39.630274 13046 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:17:39.630709 master-0 kubenswrapper[13046]: I0308 03:17:39.630573 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:39.634832 master-0 kubenswrapper[13046]: I0308 03:17:39.634767 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:39.634832 master-0 kubenswrapper[13046]: I0308 03:17:39.634831 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:39.635074 master-0 kubenswrapper[13046]: I0308 03:17:39.634860 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:39.635721 master-0 kubenswrapper[13046]: I0308 03:17:39.635672 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:39.636116 master-0 kubenswrapper[13046]: E0308 03:17:39.636050 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:40.397457 master-0 kubenswrapper[13046]: I0308 03:17:40.397371 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:40.400369 master-0 kubenswrapper[13046]: I0308 03:17:40.400317 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:40.400369 master-0 kubenswrapper[13046]: I0308 03:17:40.400367 13046 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:40.400560 master-0 kubenswrapper[13046]: I0308 03:17:40.400385 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:41.500224 master-0 kubenswrapper[13046]: I0308 03:17:41.500104 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:41.501365 master-0 kubenswrapper[13046]: I0308 03:17:41.500358 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:41.503908 master-0 kubenswrapper[13046]: I0308 03:17:41.503844 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:41.503908 master-0 kubenswrapper[13046]: I0308 03:17:41.503909 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:41.504157 master-0 kubenswrapper[13046]: I0308 03:17:41.503935 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:41.510424 master-0 kubenswrapper[13046]: I0308 03:17:41.510365 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:42.199646 master-0 kubenswrapper[13046]: I0308 03:17:42.198972 13046 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 08 03:17:42.199940 master-0 kubenswrapper[13046]: E0308 03:17:42.199816 13046 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 08 03:17:42.408402 master-0 kubenswrapper[13046]: I0308 03:17:42.408369 13046 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:42.411042 master-0 kubenswrapper[13046]: I0308 03:17:42.411022 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:42.411174 master-0 kubenswrapper[13046]: I0308 03:17:42.411160 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:42.411264 master-0 kubenswrapper[13046]: I0308 03:17:42.411252 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:42.414579 master-0 kubenswrapper[13046]: I0308 03:17:42.414360 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:43.415116 master-0 kubenswrapper[13046]: I0308 03:17:43.414948 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:43.419243 master-0 kubenswrapper[13046]: I0308 03:17:43.418105 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:43.419243 master-0 kubenswrapper[13046]: I0308 03:17:43.418169 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:43.419243 master-0 kubenswrapper[13046]: I0308 03:17:43.418189 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:45.833470 master-0 kubenswrapper[13046]: I0308 03:17:45.833392 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:17:45.834586 master-0 kubenswrapper[13046]: I0308 03:17:45.834395 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 08 03:17:45.837911 master-0 kubenswrapper[13046]: I0308 03:17:45.837857 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:45.837911 master-0 kubenswrapper[13046]: I0308 03:17:45.837911 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:45.838101 master-0 kubenswrapper[13046]: I0308 03:17:45.837930 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:45.838655 master-0 kubenswrapper[13046]: I0308 03:17:45.838602 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:45.838985 master-0 kubenswrapper[13046]: E0308 03:17:45.838920 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:45.841216 master-0 kubenswrapper[13046]: I0308 03:17:45.841166 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:17:46.435259 master-0 kubenswrapper[13046]: I0308 03:17:46.435178 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:46.438781 master-0 kubenswrapper[13046]: I0308 03:17:46.438720 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:46.438911 master-0 kubenswrapper[13046]: I0308 03:17:46.438829 13046 kubelet_node_status.go:724] "Recording event 
message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:46.438911 master-0 kubenswrapper[13046]: I0308 03:17:46.438853 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:46.439753 master-0 kubenswrapper[13046]: I0308 03:17:46.439697 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:46.440186 master-0 kubenswrapper[13046]: E0308 03:17:46.440132 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:46.444064 master-0 kubenswrapper[13046]: I0308 03:17:46.444009 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:17:47.442364 master-0 kubenswrapper[13046]: I0308 03:17:47.442289 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:47.445458 master-0 kubenswrapper[13046]: I0308 03:17:47.445354 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:47.445458 master-0 kubenswrapper[13046]: I0308 03:17:47.445464 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:47.445755 master-0 kubenswrapper[13046]: I0308 03:17:47.445512 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:47.446223 master-0 kubenswrapper[13046]: I0308 03:17:47.446165 13046 scope.go:117] 
"RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:17:47.446608 master-0 kubenswrapper[13046]: E0308 03:17:47.446553 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:17:48.735703 master-0 kubenswrapper[13046]: E0308 03:17:48.735641 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:17:49.200878 master-0 kubenswrapper[13046]: I0308 03:17:49.200784 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:49.204278 master-0 kubenswrapper[13046]: I0308 03:17:49.204215 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:49.204278 master-0 kubenswrapper[13046]: I0308 03:17:49.204277 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:49.204473 master-0 kubenswrapper[13046]: I0308 03:17:49.204297 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:49.204587 master-0 kubenswrapper[13046]: I0308 03:17:49.204549 13046 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 08 03:17:49.219273 master-0 kubenswrapper[13046]: I0308 03:17:49.219206 13046 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 08 03:17:49.219599 master-0 kubenswrapper[13046]: I0308 03:17:49.219364 13046 kubelet_node_status.go:79] "Successfully 
registered node" node="master-0" Mar 08 03:17:49.219599 master-0 kubenswrapper[13046]: E0308 03:17:49.219394 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 08 03:17:49.255415 master-0 kubenswrapper[13046]: E0308 03:17:49.255351 13046 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 08 03:17:49.356581 master-0 kubenswrapper[13046]: E0308 03:17:49.356469 13046 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 08 03:17:51.086015 master-0 kubenswrapper[13046]: I0308 03:17:51.085856 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:17:51.086784 master-0 kubenswrapper[13046]: I0308 03:17:51.086073 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:51.089570 master-0 kubenswrapper[13046]: I0308 03:17:51.089478 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:51.089570 master-0 kubenswrapper[13046]: I0308 03:17:51.089565 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:51.089765 master-0 kubenswrapper[13046]: I0308 03:17:51.089583 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:51.293219 master-0 kubenswrapper[13046]: I0308 03:17:51.293143 13046 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 03:17:52.478232 master-0 kubenswrapper[13046]: I0308 03:17:52.478166 13046 generic.go:334] "Generic (PLEG): container finished" podID="ba9496ed-060e-4118-9da6-89b82bd49263" containerID="73db2b17db7b45f368583714c7423ad3baed3f0e6461afd93878b41dc72e8454" 
exitCode=0 Mar 08 03:17:55.072352 master-0 kubenswrapper[13046]: I0308 03:17:55.072173 13046 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 03:17:55.502436 master-0 kubenswrapper[13046]: I0308 03:17:55.502315 13046 generic.go:334] "Generic (PLEG): container finished" podID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerID="ad59cc4c7958a82cb7e8357828383997f6ce39b4d62e09c7ada95209a7513c90" exitCode=0 Mar 08 03:17:55.505709 master-0 kubenswrapper[13046]: I0308 03:17:55.505639 13046 generic.go:334] "Generic (PLEG): container finished" podID="982ea338-c7be-4776-9bb7-113834c54aaa" containerID="6a754fcfb0d67c328aad3537f5cd3aea4c5a542bc823d6a29cf5e7022aa42ed0" exitCode=0 Mar 08 03:17:55.505869 master-0 kubenswrapper[13046]: I0308 03:17:55.505710 13046 scope.go:117] "RemoveContainer" containerID="5921d58846914a3755a54e900c683e987d45bddbc77ff90bbf7ebdecd811ba56" Mar 08 03:17:55.509002 master-0 kubenswrapper[13046]: I0308 03:17:55.508936 13046 generic.go:334] "Generic (PLEG): container finished" podID="fe33f926-9348-4498-a892-d2becaeecc14" containerID="a31cf751005d98b0c093a07cba9d36fdd0b091f0fc3e6728bcde1b51934cdbef" exitCode=0 Mar 08 03:17:56.518300 master-0 kubenswrapper[13046]: I0308 03:17:56.518149 13046 generic.go:334] "Generic (PLEG): container finished" podID="c9f377bf-79c5-4425-b5d1-256961835f62" containerID="d4ea1844b53b95e64939abf18bf680af5d21c94a78af3eaf8fa2b814c48bf2f0" exitCode=0 Mar 08 03:17:57.118056 master-0 kubenswrapper[13046]: I0308 03:17:57.117966 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:17:57.121749 master-0 kubenswrapper[13046]: I0308 03:17:57.121695 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:17:57.121749 master-0 kubenswrapper[13046]: I0308 03:17:57.121742 13046 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:17:57.121944 master-0 kubenswrapper[13046]: I0308 03:17:57.121762 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:17:58.736337 master-0 kubenswrapper[13046]: E0308 03:17:58.736253 13046 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 08 03:17:59.380632 master-0 kubenswrapper[13046]: E0308 03:17:59.380514 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 08 03:18:00.118078 master-0 kubenswrapper[13046]: I0308 03:18:00.117960 13046 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 03:18:00.121280 master-0 kubenswrapper[13046]: I0308 03:18:00.121213 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 08 03:18:00.121356 master-0 kubenswrapper[13046]: I0308 03:18:00.121296 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 08 03:18:00.121356 master-0 kubenswrapper[13046]: I0308 03:18:00.121317 13046 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 08 03:18:00.122098 master-0 kubenswrapper[13046]: I0308 03:18:00.122054 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:18:00.122470 master-0 kubenswrapper[13046]: E0308 03:18:00.122419 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:18:00.420770 master-0 kubenswrapper[13046]: I0308 03:18:00.420616 13046 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 03:18:01.236741 master-0 kubenswrapper[13046]: I0308 03:18:01.236645 13046 apiserver.go:52] "Watching apiserver" Mar 08 03:18:01.259796 master-0 kubenswrapper[13046]: I0308 03:18:01.259702 13046 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 03:18:01.270569 master-0 kubenswrapper[13046]: I0308 03:18:01.270394 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq","openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4","openshift-dns/dns-default-htnv4","openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n","openshift-etcd/etcd-master-0-master-0","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc","openshift-kube-apiserver/kube-apiserver-master-0","openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6","openshift-kube-scheduler/installer-4-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw","assisted-installer/assisted-installer-controller-9g2h9","openshift-kube-controller-manager/installer-1-master-0","openshift-marketplace/certified-operators-jnlct","openshift-marketplace/community-operators-86z4t","openshift-marketplace/redhat-operators-zm8fd","openshift-multus/multus-additional-cni-plugins-5qjn5","openshift-multus/multus-hfnwm","openshift-service-ca/service-ca-84bfdbbb7f-fqhlq","openshift-network-diagnostics/network-check-target-l5x6h","openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9","op
enshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq","openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v","openshift-dns-operator/dns-operator-589895fbb7-z45kw","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42","openshift-marketplace/community-operators-zm92r","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk","openshift-kube-controller-manager/installer-2-master-0","openshift-multus/multus-admission-controller-8d675b596-772zs","openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz","openshift-cluster-node-tuning-operator/tuned-ntxqg","openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv","kube-system/bootstrap-kube-controller-manager-master-0","openshift-insights/insights-operator-8f89dfddd-zd6kq","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl","openshift-marketplace/redhat-marketplace-nggbb","openshift-network-node-identity/network-node-identity-xjg74","openshift-network-operator/network-operator-7c649bf6d4-98n6d","openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n","openshift-apiserver/apiserver-778796f487-vzb5n","openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm","openshift-dns/node-resolver-lmqn7","openshift-etcd/installer-1-master-0","openshift-kube-apiserver/installer-1-master-0","openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj","openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr","openshift-controller-manager/controller-manager-6494b94d74-kwkcq","openshift-multus/multus-admission-co
ntroller-7769569c45-f85rr","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-marketplace/certified-operators-r494d","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr","openshift-multus/network-metrics-daemon-jl9tj","openshift-network-operator/iptables-alerter-g86jc","kube-system/bootstrap-kube-scheduler-master-0","openshift-marketplace/redhat-marketplace-d5qh2","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8","openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4","openshift-machine-api/machine-api-operator-84bf6db4f9-qt654","openshift-machine-config-operator/machine-config-daemon-j6n9g","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7","openshift-ingress-operator/ingress-operator-677db989d6-r9m2k","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6","openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf","openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx","openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-ovn-kubernetes/ovnkube-node-krdvz","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v"] Mar 08 03:18:01.271043 master-0 kubenswrapper[13046]: I0308 03:18:01.270951 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-9g2h9" Mar 08 03:18:01.279340 master-0 kubenswrapper[13046]: I0308 03:18:01.279275 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 03:18:01.279340 master-0 kubenswrapper[13046]: I0308 03:18:01.279315 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:18:01.280393 master-0 kubenswrapper[13046]: I0308 03:18:01.279699 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:18:01.280393 master-0 kubenswrapper[13046]: I0308 03:18:01.279770 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.280641 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.280684 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.280756 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.280695 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.281128 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.281290 13046 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.281422 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.281444 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.281622 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 08 03:18:01.282568 master-0 kubenswrapper[13046]: I0308 03:18:01.281654 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 03:18:01.283232 master-0 kubenswrapper[13046]: I0308 03:18:01.283188 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:18:01.283304 master-0 kubenswrapper[13046]: I0308 03:18:01.283253 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:18:01.283369 master-0 kubenswrapper[13046]: I0308 03:18:01.283316 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 03:18:01.283888 master-0 kubenswrapper[13046]: I0308 03:18:01.283439 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 03:18:01.283888 master-0 kubenswrapper[13046]: I0308 03:18:01.283522 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 03:18:01.284055 master-0 
kubenswrapper[13046]: I0308 03:18:01.283984 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 03:18:01.284119 master-0 kubenswrapper[13046]: I0308 03:18:01.284079 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 03:18:01.284950 master-0 kubenswrapper[13046]: I0308 03:18:01.284753 13046 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="199ae602-a267-48d4-bfaf-162ba27cf027" Mar 08 03:18:01.284950 master-0 kubenswrapper[13046]: I0308 03:18:01.284807 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 03:18:01.285317 master-0 kubenswrapper[13046]: I0308 03:18:01.285273 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 03:18:01.285817 master-0 kubenswrapper[13046]: I0308 03:18:01.285779 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:18:01.287245 master-0 kubenswrapper[13046]: I0308 03:18:01.287194 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:18:01.287821 master-0 kubenswrapper[13046]: E0308 03:18:01.287579 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:18:01.291216 master-0 
kubenswrapper[13046]: I0308 03:18:01.291188 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.291558 master-0 kubenswrapper[13046]: I0308 03:18:01.291513 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:18:01.295753 master-0 kubenswrapper[13046]: I0308 03:18:01.295660 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:18:01.307428 master-0 kubenswrapper[13046]: I0308 03:18:01.307369 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" podUID="234638fe-5577-45bc-9094-907c5611da38" containerName="kube-rbac-proxy" containerID="cri-o://384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532" gracePeriod=30 Mar 08 03:18:01.309855 master-0 kubenswrapper[13046]: I0308 03:18:01.309772 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 03:18:01.310548 master-0 kubenswrapper[13046]: I0308 03:18:01.310464 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 03:18:01.310973 master-0 kubenswrapper[13046]: I0308 03:18:01.310930 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:18:01.311192 master-0 kubenswrapper[13046]: I0308 03:18:01.311153 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 03:18:01.311273 master-0 kubenswrapper[13046]: I0308 03:18:01.311211 13046 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 08 03:18:01.311403 master-0 kubenswrapper[13046]: I0308 03:18:01.311364 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 08 03:18:01.311777 master-0 kubenswrapper[13046]: I0308 03:18:01.311726 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 03:18:01.318366 master-0 kubenswrapper[13046]: I0308 03:18:01.318308 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 03:18:01.318672 master-0 kubenswrapper[13046]: I0308 03:18:01.318619 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 08 03:18:01.318884 master-0 kubenswrapper[13046]: I0308 03:18:01.318843 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 03:18:01.319620 master-0 kubenswrapper[13046]: I0308 03:18:01.319055 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.319714 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.319909 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.321311 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.321432 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.322213 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.322384 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.322450 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.322896 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323018 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323058 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323208 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323253 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323337 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323365 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323530 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323627 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323686 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323749 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_e81d3c37-e8d7-44c8-973e-13992380ce85/installer/0.log"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323787 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323828 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323630 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323928 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.323868 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.324004 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.324074 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.324916 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.325947 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g"
Mar 08 03:18:01.326514 master-0 kubenswrapper[13046]: I0308 03:18:01.326441 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-f85rr"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.326961 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.327188 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.327253 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.327434 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.327691 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.327714 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.327917 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.327941 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328103 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328172 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328184 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328271 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328404 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328418 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328635 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328653 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328682 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.331513 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.331708 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.331971 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.332140 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghdk\" (UniqueName: \"kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.332235 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.331747 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.332451 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.332539 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.328431 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.332844 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.333383 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.333953 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.334297 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.332248 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3178dfc0-a35e-418e-a954-cd919b8af88c-config\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.334720 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf24l\" (UniqueName: \"kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.334786 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbqkj\" (UniqueName: \"kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.335056 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.335335 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.335730 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.335785 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.336035 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.336330 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.336383 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.336475 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.336714 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.336873 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.336927 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.337225 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.337107 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.337330 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 08 03:18:01.337775 master-0 kubenswrapper[13046]: I0308 03:18:01.337833 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.338558 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3178dfc0-a35e-418e-a954-cd919b8af88c-serving-cert\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.338690 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.338741 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.338879 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.339782 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.339826 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d83aa242-606f-4adc-b689-4aa89625b533-srv-cert\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.337346 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lb9w\" (UniqueName: \"kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.341766 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.342682 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.342933 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fba73e-c201-4866-bc69-64892ea5bdca-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.343663 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-serving-cert\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.343997 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-config\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.344266 master-0 kubenswrapper[13046]: I0308 03:18:01.344053 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.343896 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.344716 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aadbbe97-2a03-40da-846d-252e29661f67-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.344805 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fba73e-c201-4866-bc69-64892ea5bdca-config\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.344762 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.344847 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.344961 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345009 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.344964 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bbe9b81-0efb-4caa-bacd-55348cd392c6-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345073 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mv56\" (UniqueName: \"kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56\") pod \"csi-snapshot-controller-operator-5685fbc7d-8fxl8\" (UID: \"ba9496ed-060e-4118-9da6-89b82bd49263\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345171 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345229 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345268 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfxd\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345301 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345333 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345364 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vt6t\" (UniqueName: \"kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345390 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345532 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aadbbe97-2a03-40da-846d-252e29661f67-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345423 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345686 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345756 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345826 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-images\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345835 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdxk\" (UniqueName: \"kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345901 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2vmz\" (UniqueName: \"kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.345981 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.346012 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-config\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.346333 master-0 kubenswrapper[13046]: I0308 03:18:01.346049 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:18:01.356215 master-0 kubenswrapper[13046]: I0308 03:18:01.356036 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e74c8bb2-e063-4b60-b3fe-651aa534d029-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:18:01.357729 master-0 kubenswrapper[13046]: I0308 03:18:01.356730 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedc7538-9cc6-4bf5-9628-e278310d796b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:18:01.357729 master-0 kubenswrapper[13046]: I0308 03:18:01.357114 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e74c8bb2-e063-4b60-b3fe-651aa534d029-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc"
Mar 08 03:18:01.357729 master-0 kubenswrapper[13046]: I0308 03:18:01.357437 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q"
Mar 08 03:18:01.357867 master-0 kubenswrapper[13046]: I0308 03:18:01.357792 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c0192f3-2e60-42c6-9836-c70a9fa407d5-etcd-client\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4"
Mar 08 03:18:01.358514 master-0 kubenswrapper[13046]: I0308 03:18:01.358125 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e71caa06-6ce7-47c9-a267-21f6b6af9247-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:18:01.358514 master-0 kubenswrapper[13046]: I0308 03:18:01.358393 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71caa06-6ce7-47c9-a267-21f6b6af9247-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c"
Mar 08 03:18:01.363375 master-0 kubenswrapper[13046]: I0308 03:18:01.363123 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 08 03:18:01.367272 master-0 kubenswrapper[13046]: I0308 03:18:01.366473 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 03:18:01.368048 master-0 kubenswrapper[13046]: I0308 03:18:01.367935
13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 03:18:01.368048 master-0 kubenswrapper[13046]: I0308 03:18:01.367955 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 03:18:01.374159 master-0 kubenswrapper[13046]: I0308 03:18:01.374121 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 03:18:01.374751 master-0 kubenswrapper[13046]: I0308 03:18:01.374596 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 03:18:01.374840 master-0 kubenswrapper[13046]: I0308 03:18:01.374617 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 03:18:01.375711 master-0 kubenswrapper[13046]: I0308 03:18:01.375659 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 03:18:01.384580 master-0 kubenswrapper[13046]: I0308 03:18:01.384522 13046 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 08 03:18:01.394467 master-0 kubenswrapper[13046]: I0308 03:18:01.394409 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 03:18:01.413809 master-0 kubenswrapper[13046]: I0308 03:18:01.413760 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 03:18:01.434935 master-0 kubenswrapper[13046]: I0308 03:18:01.434892 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 03:18:01.447613 master-0 kubenswrapper[13046]: I0308 03:18:01.447409 13046 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e81d3c37-e8d7-44c8-973e-13992380ce85-kube-api-access\") pod \"e81d3c37-e8d7-44c8-973e-13992380ce85\" (UID: \"e81d3c37-e8d7-44c8-973e-13992380ce85\") " Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.447789 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.447830 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-var-lib-kubelet\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.447851 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-apiservice-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.447926 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " 
pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.447944 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkfwp\" (UniqueName: \"kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.447969 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.447985 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448001 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda3bd48-6de3-49b0-b2ce-96d97e97f178-config-volume\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448019 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgw92\" (UniqueName: 
\"kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448036 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448053 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448038 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/7d23557f-6bb1-46ce-a56e-d0011c576125-operand-assets\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448071 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " 
pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448202 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448248 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-node-pullsecrets\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.448333 master-0 kubenswrapper[13046]: I0308 03:18:01.448324 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-kubernetes\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.448837 master-0 kubenswrapper[13046]: I0308 03:18:01.448410 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:01.448837 master-0 kubenswrapper[13046]: I0308 03:18:01.448565 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m7hzl\" (UniqueName: \"kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:18:01.448837 master-0 kubenswrapper[13046]: I0308 03:18:01.448682 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:01.448837 master-0 kubenswrapper[13046]: I0308 03:18:01.448725 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:18:01.448837 master-0 kubenswrapper[13046]: I0308 03:18:01.448776 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.448837 master-0 kubenswrapper[13046]: I0308 03:18:01.448686 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-config\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: 
\"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:18:01.449056 master-0 kubenswrapper[13046]: I0308 03:18:01.448817 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:18:01.449056 master-0 kubenswrapper[13046]: I0308 03:18:01.448932 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b83ab56c-e28d-4e82-ae8f-92649a1448ed-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:18:01.449056 master-0 kubenswrapper[13046]: I0308 03:18:01.448960 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.449056 master-0 kubenswrapper[13046]: I0308 03:18:01.449036 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb2xh\" (UniqueName: \"kubernetes.io/projected/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-kube-api-access-pb2xh\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" Mar 08 03:18:01.449171 master-0 kubenswrapper[13046]: I0308 
03:18:01.449079 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-lib-modules\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.449171 master-0 kubenswrapper[13046]: I0308 03:18:01.449117 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-audit-dir\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.449171 master-0 kubenswrapper[13046]: I0308 03:18:01.449154 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:18:01.449257 master-0 kubenswrapper[13046]: I0308 03:18:01.449187 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.449257 master-0 kubenswrapper[13046]: I0308 03:18:01.449218 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-modprobe-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.449313 master-0 kubenswrapper[13046]: I0308 03:18:01.449256 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:18:01.449313 master-0 kubenswrapper[13046]: I0308 03:18:01.449290 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-sys\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.449374 master-0 kubenswrapper[13046]: I0308 03:18:01.449328 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:01.449374 master-0 kubenswrapper[13046]: I0308 03:18:01.449365 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8dt\" (UniqueName: \"kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:18:01.449434 master-0 kubenswrapper[13046]: I0308 03:18:01.449401 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/af653e87-ce5f-4f1a-a20d-233c563694ba-serving-cert\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.449462 master-0 kubenswrapper[13046]: I0308 03:18:01.449444 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6432d23b-a55a-4131-83d5-5f16419809dd-config\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:18:01.449523 master-0 kubenswrapper[13046]: I0308 03:18:01.449444 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.449564 master-0 kubenswrapper[13046]: I0308 03:18:01.449540 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-serving-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.449603 master-0 kubenswrapper[13046]: I0308 03:18:01.449573 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:01.449635 master-0 kubenswrapper[13046]: 
I0308 03:18:01.449295 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd6b827c-70b0-47ed-b07c-c696343248a8-trusted-ca\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:18:01.449635 master-0 kubenswrapper[13046]: I0308 03:18:01.449602 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/febf6a91-8b78-4b22-93b9-155cb7761fc4-tmpfs\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:01.449697 master-0 kubenswrapper[13046]: I0308 03:18:01.449667 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.449729 master-0 kubenswrapper[13046]: I0308 03:18:01.449706 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c9f377bf-79c5-4425-b5d1-256961835f62-signing-key\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:18:01.449836 master-0 kubenswrapper[13046]: I0308 03:18:01.449745 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d358134e-5625-492c-b4f7-460798631270-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:18:01.450089 master-0 kubenswrapper[13046]: I0308 03:18:01.449751 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af653e87-ce5f-4f1a-a20d-233c563694ba-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.451055 master-0 kubenswrapper[13046]: I0308 03:18:01.450190 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:18:01.451055 master-0 kubenswrapper[13046]: I0308 03:18:01.449670 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/febf6a91-8b78-4b22-93b9-155cb7761fc4-tmpfs\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:01.451055 master-0 kubenswrapper[13046]: I0308 03:18:01.450266 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.451055 master-0 kubenswrapper[13046]: I0308 03:18:01.450366 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities\") 
pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:18:01.451055 master-0 kubenswrapper[13046]: I0308 03:18:01.450588 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-982r4\" (UniqueName: \"kubernetes.io/projected/febf6a91-8b78-4b22-93b9-155cb7761fc4-kube-api-access-982r4\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:01.451055 master-0 kubenswrapper[13046]: I0308 03:18:01.450780 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" Mar 08 03:18:01.451247 master-0 kubenswrapper[13046]: I0308 03:18:01.450958 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.451247 master-0 kubenswrapper[13046]: I0308 03:18:01.451123 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:18:01.451247 master-0 kubenswrapper[13046]: I0308 03:18:01.451163 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.451329 master-0 kubenswrapper[13046]: I0308 03:18:01.451265 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtt8w\" (UniqueName: \"kubernetes.io/projected/fe33f926-9348-4498-a892-d2becaeecc14-kube-api-access-rtt8w\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:01.451329 master-0 kubenswrapper[13046]: I0308 03:18:01.451304 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9jsw\" (UniqueName: \"kubernetes.io/projected/50ab8f71-42b8-4967-8a0b-016647c59a37-kube-api-access-h9jsw\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct" Mar 08 03:18:01.451392 master-0 kubenswrapper[13046]: I0308 03:18:01.451342 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-snapshots\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:01.451429 master-0 kubenswrapper[13046]: I0308 03:18:01.450517 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 
03:18:01.451429 master-0 kubenswrapper[13046]: I0308 03:18:01.450669 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd6b827c-70b0-47ed-b07c-c696343248a8-metrics-tls\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:18:01.451429 master-0 kubenswrapper[13046]: I0308 03:18:01.451412 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:18:01.451619 master-0 kubenswrapper[13046]: I0308 03:18:01.450452 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cni-binary-copy\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.451619 master-0 kubenswrapper[13046]: I0308 03:18:01.451573 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-snapshots\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:01.451678 master-0 kubenswrapper[13046]: I0308 03:18:01.451616 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.451707 master-0 kubenswrapper[13046]: I0308 03:18:01.451686 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.451743 master-0 kubenswrapper[13046]: I0308 03:18:01.451729 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlj9x\" (UniqueName: \"kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.451767 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-catalog-content\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.451814 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.451890 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-catalog-content\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.451910 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-audit\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.451943 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.451964 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.451989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8fg7\" (UniqueName: \"kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 
03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452015 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452040 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hb9\" (UniqueName: \"kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452061 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s27xv\" (UniqueName: \"kubernetes.io/projected/c3729e29-4c57-4f9b-8202-a87fd3a9a722-kube-api-access-s27xv\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452067 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3bf93333-b537-4f23-9c77-6a245b290fe3-available-featuregates\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452084 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452155 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-serving-cert\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452179 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452281 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452306 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.452527 master-0 
kubenswrapper[13046]: I0308 03:18:01.452332 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452353 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452373 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452378 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452396 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452427 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452460 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vvq\" (UniqueName: \"kubernetes.io/projected/6a9d0240-fc00-4d78-9458-8f53b1876f1b-kube-api-access-b2vvq\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:01.452527 master-0 kubenswrapper[13046]: I0308 03:18:01.452511 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pxwl\" (UniqueName: \"kubernetes.io/projected/306b824f-dcfb-4e69-9a23-64dfbae61852-kube-api-access-4pxwl\") pod \"migrator-57ccdf9b5-xps42\" (UID: \"306b824f-dcfb-4e69-9a23-64dfbae61852\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452723 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452760 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w5xsp\" (UniqueName: \"kubernetes.io/projected/dfe0357f-dab4-4424-869c-f6070b411a35-kube-api-access-w5xsp\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452783 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf57p\" (UniqueName: \"kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452806 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpwx\" (UniqueName: \"kubernetes.io/projected/caa3a50c-1291-4152-a48a-f7c7b49627db-kube-api-access-6bpwx\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452829 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv8wt\" (UniqueName: \"kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452854 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysconfig\") pod \"tuned-ntxqg\" (UID: 
\"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452876 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452901 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452923 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452947 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af653e87-ce5f-4f1a-a20d-233c563694ba-service-ca\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452968 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.452989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-utilities\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453002 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-config\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453017 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5ll\" (UniqueName: \"kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453075 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 
03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453113 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453157 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pfx\" (UniqueName: \"kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453224 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-serving-cert\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453291 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453332 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-webhook-certs\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: 
\"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:18:01.453371 master-0 kubenswrapper[13046]: I0308 03:18:01.453368 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453403 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453441 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453511 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453552 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fvzs9\" (UniqueName: \"kubernetes.io/projected/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-kube-api-access-fvzs9\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453590 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453627 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453665 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453688 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-utilities\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " 
pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453719 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nwgh\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-kube-api-access-4nwgh\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453774 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7t9\" (UniqueName: \"kubernetes.io/projected/9cf6ce1a-c203-4033-86be-be16694a9062-kube-api-access-ml7t9\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453823 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-serving-ca\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453877 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453898 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/4108f513-acef-473a-ab03-f3761b2bd0d8-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453921 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422p2\" (UniqueName: \"kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.453972 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-utilities\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.454004 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 03:18:01.454044 master-0 kubenswrapper[13046]: I0308 03:18:01.454025 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jk4m\" (UniqueName: \"kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454079 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-dir\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454128 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454176 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-trusted-ca-bundle\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454222 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454313 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: 
I0308 03:18:01.454350 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f99f81a-fd2d-432e-a3bc-e451342650b1-metrics-tls\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454370 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/53254b19-b5b3-4f97-bc64-37be8b2a41b7-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454431 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trhxt\" (UniqueName: \"kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454502 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b83ab56c-e28d-4e82-ae8f-92649a1448ed-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454630 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454736 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/53254b19-b5b3-4f97-bc64-37be8b2a41b7-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: E0308 03:18:01.454750 13046 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: object "openshift-cluster-machine-approver"/"kube-rbac-proxy" not registered
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454800 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: E0308 03:18:01.454814 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config podName:234638fe-5577-45bc-9094-907c5611da38 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:01.95479421 +0000 UTC m=+284.033561537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config") pod "machine-approver-955fcfb87-jxrq4" (UID: "234638fe-5577-45bc-9094-907c5611da38") : object "openshift-cluster-machine-approver"/"kube-rbac-proxy" not registered
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454843 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454871 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454903 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97fm\" (UniqueName: \"kubernetes.io/projected/0e569889-4759-4046-b0ed-e550078521c6-kube-api-access-m97fm\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454935 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454931 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-ovnkube-identity-cm\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454961 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh2rs\" (UniqueName: \"kubernetes.io/projected/de90d207-06d6-4778-b1b0-9020b1f2a881-kube-api-access-lh2rs\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.454988 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455090 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-utilities\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455116 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-serving-cert\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455310 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455384 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxbs\" (UniqueName: \"kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455425 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bda3bd48-6de3-49b0-b2ce-96d97e97f178-metrics-tls\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455535 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455598 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-webhook-certs\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455725 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455764 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6432d23b-a55a-4131-83d5-5f16419809dd-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455828 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455858 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/982ea338-c7be-4776-9bb7-113834c54aaa-metrics-tls\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455875 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455906 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-run\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455943 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7jv\" (UniqueName: \"kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.455973 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dskxf\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-kube-api-access-dskxf\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.456032 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.456116 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhh9\" (UniqueName: \"kubernetes.io/projected/c9f377bf-79c5-4425-b5d1-256961835f62-kube-api-access-6nhh9\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.456145 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:18:01.456380 master-0 kubenswrapper[13046]: I0308 03:18:01.456236 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.456561 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.456707 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.456786 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-utilities\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.456834 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1092f2a6-865c-4706-bba7-068621e85ebc-rootfs\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.456872 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.456912 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-catalog-content\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.456954 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457034 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457078 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt69c\" (UniqueName: \"kubernetes.io/projected/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-kube-api-access-qt69c\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457034 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cf6ce1a-c203-4033-86be-be16694a9062-utilities\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457143 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-catalog-content\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457294 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntd2k\" (UniqueName: \"kubernetes.io/projected/dac2b210-2fbb-4d25-a0ea-1825259cee3b-kube-api-access-ntd2k\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457633 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457700 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llwh7\" (UniqueName: \"kubernetes.io/projected/1092f2a6-865c-4706-bba7-068621e85ebc-kube-api-access-llwh7\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457805 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqnd\" (UniqueName: \"kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457857 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457898 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457940 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.457981 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458024 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-conf\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458088 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458128 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llf9g\" (UniqueName: \"kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458170 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzmjd\" (UniqueName: \"kubernetes.io/projected/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad-kube-api-access-rzmjd\") pod \"csi-snapshot-controller-7577d6f48-j6jpn\" (UID: \"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458207 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-metrics-certs\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458208 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458327 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458369 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-catalog-content\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458419 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpd47\" (UniqueName: \"kubernetes.io/projected/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-kube-api-access-xpd47\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458459 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-trusted-ca-bundle\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458479 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458521 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458583 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458661 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458788 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458881 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458922 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/275be8d3-df30-46f7-9d0a-806e404dfd57-iptables-alerter-script\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458688 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4108f513-acef-473a-ab03-f3761b2bd0d8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775"
Mar 08 03:18:01.459233 master-0 kubenswrapper[13046]: I0308 03:18:01.458989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459372 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68309159-130a-4ffa-acec-95dc4b795b8f-catalog-content\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459403 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d358134e-5625-492c-b4f7-460798631270-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459517 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-host\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459618 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459635 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459687 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459767 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459793 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459799 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dfe0357f-dab4-4424-869c-f6070b411a35-hosts-file\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459894 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-image-import-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459922 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459969 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.459993 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.460080 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c5z\" (UniqueName: \"kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.460287 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxtt\" (UniqueName: \"kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.460333 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dn9\" (UniqueName: \"kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.460376 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5"
Mar 08 03:18:01.460545 master-0 kubenswrapper[13046]: I0308 03:18:01.460464 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460577 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm"
Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460653 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bcd7\" (UniqueName: \"kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4"
Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460698 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460736 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460770 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz"
Mar 08
03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460811 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58bm\" (UniqueName: \"kubernetes.io/projected/68309159-130a-4ffa-acec-95dc4b795b8f-kube-api-access-k58bm\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r" Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460852 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zh5b\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460889 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-tmp\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460927 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-policies\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.460973 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " 
pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:18:01.461092 master-0 kubenswrapper[13046]: I0308 03:18:01.461059 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-tmp\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461114 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c9f377bf-79c5-4425-b5d1-256961835f62-signing-cabundle\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461153 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461208 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461249 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461288 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-encryption-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461325 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e569889-4759-4046-b0ed-e550078521c6-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461343 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0a08ddb-1045-4631-ba52-93f3046ebd0a-config\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:18:01.461383 master-0 kubenswrapper[13046]: I0308 03:18:01.461366 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " 
pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:18:01.461641 master-0 kubenswrapper[13046]: I0308 03:18:01.461403 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.461641 master-0 kubenswrapper[13046]: I0308 03:18:01.461444 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qqr\" (UniqueName: \"kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.461641 master-0 kubenswrapper[13046]: I0308 03:18:01.461447 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c9f377bf-79c5-4425-b5d1-256961835f62-signing-cabundle\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:18:01.461641 master-0 kubenswrapper[13046]: I0308 03:18:01.461502 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:18:01.461641 master-0 kubenswrapper[13046]: I0308 03:18:01.461566 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" 
(UniqueName: \"kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:18:01.461641 master-0 kubenswrapper[13046]: I0308 03:18:01.461613 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.461806 master-0 kubenswrapper[13046]: I0308 03:18:01.461649 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.461806 master-0 kubenswrapper[13046]: I0308 03:18:01.461686 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.461806 master-0 kubenswrapper[13046]: I0308 03:18:01.461719 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:18:01.461806 master-0 kubenswrapper[13046]: I0308 03:18:01.461722 
13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.461913 master-0 kubenswrapper[13046]: E0308 03:18:01.461890 13046 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: object "openshift-cluster-machine-approver"/"machine-approver-tls" not registered Mar 08 03:18:01.461947 master-0 kubenswrapper[13046]: I0308 03:18:01.461905 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d23557f-6bb1-46ce-a56e-d0011c576125-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:18:01.462040 master-0 kubenswrapper[13046]: E0308 03:18:01.461974 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls podName:234638fe-5577-45bc-9094-907c5611da38 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:01.961950639 +0000 UTC m=+284.040717896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls") pod "machine-approver-955fcfb87-jxrq4" (UID: "234638fe-5577-45bc-9094-907c5611da38") : object "openshift-cluster-machine-approver"/"machine-approver-tls" not registered Mar 08 03:18:01.462218 master-0 kubenswrapper[13046]: I0308 03:18:01.462089 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-client\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.462218 master-0 kubenswrapper[13046]: I0308 03:18:01.462144 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.462218 master-0 kubenswrapper[13046]: I0308 03:18:01.462186 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-catalog-content\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.462381 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9d0240-fc00-4d78-9458-8f53b1876f1b-catalog-content\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:01.465402 master-0 
kubenswrapper[13046]: I0308 03:18:01.464435 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.464580 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.464632 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556dx\" (UniqueName: \"kubernetes.io/projected/f99d6808-9fec-402d-93f7-41575a5a0a08-kube-api-access-556dx\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.464688 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.464924 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " 
pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.464976 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-885mp\" (UniqueName: \"kubernetes.io/projected/bda3bd48-6de3-49b0-b2ce-96d97e97f178-kube-api-access-885mp\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.465032 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb4bq\" (UniqueName: \"kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.465076 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.465116 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:01.465402 master-0 kubenswrapper[13046]: I0308 03:18:01.465151 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465408 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465462 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-encryption-config\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465519 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465552 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:18:01.465936 master-0 
kubenswrapper[13046]: I0308 03:18:01.465560 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465610 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465649 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465684 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465719 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: 
\"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465752 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465795 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-client\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465844 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465878 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0a08ddb-1045-4631-ba52-93f3046ebd0a-serving-cert\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:18:01.465936 master-0 kubenswrapper[13046]: I0308 03:18:01.465896 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.465956 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.465997 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-service-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.466031 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.466071 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " 
pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.466448 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf93333-b537-4f23-9c77-6a245b290fe3-serving-cert\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.467013 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f822854-b9ac-46f2-b03b-e7215fba9208-srv-cert\") pod \"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.467030 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.467328 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovnkube-script-lib\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.467387 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.467616 master-0 kubenswrapper[13046]: I0308 03:18:01.467427 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.467976 master-0 kubenswrapper[13046]: I0308 03:18:01.467835 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.467976 master-0 kubenswrapper[13046]: I0308 03:18:01.467883 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.467976 master-0 kubenswrapper[13046]: I0308 03:18:01.467920 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-tuned\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.467976 master-0 kubenswrapper[13046]: I0308 03:18:01.467955 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config\") pod \"machine-approver-955fcfb87-jxrq4\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:18:01.468087 master-0 kubenswrapper[13046]: I0308 03:18:01.467997 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84q5n\" (UniqueName: \"kubernetes.io/projected/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-kube-api-access-84q5n\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" Mar 08 03:18:01.468123 master-0 kubenswrapper[13046]: I0308 03:18:01.468104 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-tuned\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.468203 master-0 kubenswrapper[13046]: E0308 03:18:01.468177 13046 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: object "openshift-cluster-machine-approver"/"machine-approver-config" not registered Mar 08 03:18:01.468267 master-0 kubenswrapper[13046]: E0308 03:18:01.468248 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config podName:234638fe-5577-45bc-9094-907c5611da38 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:01.968224826 +0000 UTC m=+284.046992053 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config") pod "machine-approver-955fcfb87-jxrq4" (UID: "234638fe-5577-45bc-9094-907c5611da38") : object "openshift-cluster-machine-approver"/"machine-approver-config" not registered Mar 08 03:18:01.468303 master-0 kubenswrapper[13046]: I0308 03:18:01.468270 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-daemon-config\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.468381 master-0 kubenswrapper[13046]: I0308 03:18:01.468344 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-ovn-node-metrics-cert\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.468750 master-0 kubenswrapper[13046]: I0308 03:18:01.468447 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.468750 master-0 kubenswrapper[13046]: I0308 03:18:01.468533 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:18:01.468750 master-0 kubenswrapper[13046]: I0308 03:18:01.468571 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:18:01.468750 master-0 kubenswrapper[13046]: I0308 03:18:01.468606 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-utilities\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct" Mar 08 03:18:01.468750 master-0 kubenswrapper[13046]: I0308 03:18:01.468641 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-systemd\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.468750 master-0 kubenswrapper[13046]: I0308 03:18:01.468677 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-webhook-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:01.468750 master-0 kubenswrapper[13046]: I0308 03:18:01.468719 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-env-overrides\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.469047 master-0 
kubenswrapper[13046]: I0308 03:18:01.468823 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50ab8f71-42b8-4967-8a0b-016647c59a37-utilities\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct" Mar 08 03:18:01.469047 master-0 kubenswrapper[13046]: I0308 03:18:01.468919 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:18:01.469254 master-0 kubenswrapper[13046]: I0308 03:18:01.469200 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-webhook-cert\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:18:01.474069 master-0 kubenswrapper[13046]: I0308 03:18:01.474018 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 03:18:01.481121 master-0 kubenswrapper[13046]: I0308 03:18:01.481081 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c9f377bf-79c5-4425-b5d1-256961835f62-signing-key\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:18:01.494006 master-0 kubenswrapper[13046]: I0308 03:18:01.493952 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 03:18:01.498904 master-0 kubenswrapper[13046]: I0308 
03:18:01.498859 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:18:01.513758 master-0 kubenswrapper[13046]: I0308 03:18:01.513686 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 03:18:01.516036 master-0 kubenswrapper[13046]: I0308 03:18:01.515988 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81d3c37-e8d7-44c8-973e-13992380ce85-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e81d3c37-e8d7-44c8-973e-13992380ce85" (UID: "e81d3c37-e8d7-44c8-973e-13992380ce85"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:01.533467 master-0 kubenswrapper[13046]: I0308 03:18:01.533424 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 03:18:01.538775 master-0 kubenswrapper[13046]: I0308 03:18:01.538744 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-env-overrides\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:18:01.569879 master-0 kubenswrapper[13046]: I0308 03:18:01.554142 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 03:18:01.572346 master-0 kubenswrapper[13046]: I0308 03:18:01.572297 13046 generic.go:334] "Generic (PLEG): container finished" podID="234638fe-5577-45bc-9094-907c5611da38" containerID="384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532" exitCode=0 Mar 08 03:18:01.572583 master-0 kubenswrapper[13046]: I0308 03:18:01.572540 13046 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" Mar 08 03:18:01.572849 master-0 kubenswrapper[13046]: I0308 03:18:01.572818 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" event={"ID":"234638fe-5577-45bc-9094-907c5611da38","Type":"ContainerDied","Data":"384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532"} Mar 08 03:18:01.572920 master-0 kubenswrapper[13046]: I0308 03:18:01.572863 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4" event={"ID":"234638fe-5577-45bc-9094-907c5611da38","Type":"ContainerDied","Data":"a153dffc082c8d8e34a6c6e6c0c21f4bb223cf1b6ae19843ae82a4a21f8d697f"} Mar 08 03:18:01.572920 master-0 kubenswrapper[13046]: I0308 03:18:01.572891 13046 scope.go:117] "RemoveContainer" containerID="384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532" Mar 08 03:18:01.573069 master-0 kubenswrapper[13046]: I0308 03:18:01.573042 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.573578 master-0 kubenswrapper[13046]: I0308 03:18:01.573538 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 08 03:18:01.575136 master-0 kubenswrapper[13046]: I0308 03:18:01.575077 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.575215 master-0 kubenswrapper[13046]: I0308 03:18:01.575152 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:18:01.575215 master-0 kubenswrapper[13046]: I0308 03:18:01.575186 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.575301 master-0 kubenswrapper[13046]: I0308 03:18:01.575220 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.575301 master-0 kubenswrapper[13046]: I0308 03:18:01.575252 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: 
\"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.575301 master-0 kubenswrapper[13046]: I0308 03:18:01.575277 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.575440 master-0 kubenswrapper[13046]: I0308 03:18:01.575311 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.575440 master-0 kubenswrapper[13046]: I0308 03:18:01.575363 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.575440 master-0 kubenswrapper[13046]: I0308 03:18:01.575397 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.575590 master-0 kubenswrapper[13046]: I0308 03:18:01.575440 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.575590 master-0 kubenswrapper[13046]: I0308 03:18:01.575477 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:18:01.575590 master-0 kubenswrapper[13046]: I0308 03:18:01.575530 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.575590 master-0 kubenswrapper[13046]: I0308 03:18:01.575570 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.575590 master-0 kubenswrapper[13046]: I0308 03:18:01.575588 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-systemd\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575621 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-var-lib-kubelet\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575666 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575683 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575702 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-node-pullsecrets\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575718 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-kubernetes\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575752 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575781 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575804 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-lib-modules\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575820 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575836 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-modprobe-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.575857 master-0 kubenswrapper[13046]: I0308 03:18:01.575852 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-audit-dir\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.575877 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-sys\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.575946 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.575996 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.576041 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 
08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.576063 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.576092 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.576130 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.576187 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysconfig\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.576237 master-0 kubenswrapper[13046]: I0308 03:18:01.576204 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 
03:18:01.576722 master-0 kubenswrapper[13046]: I0308 03:18:01.576279 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.576722 master-0 kubenswrapper[13046]: I0308 03:18:01.576298 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:18:01.576722 master-0 kubenswrapper[13046]: I0308 03:18:01.576378 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593062 master-0 kubenswrapper[13046]: I0308 03:18:01.578725 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-kubelet\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593062 master-0 kubenswrapper[13046]: I0308 03:18:01.593031 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 
03:18:01.593062 master-0 kubenswrapper[13046]: I0308 03:18:01.578823 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-etc-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593062 master-0 kubenswrapper[13046]: I0308 03:18:01.578867 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.578906 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-k8s-cni-cncf-io\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.579040 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-os-release\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.579074 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-bin\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.579107 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.579139 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-netns\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.579186 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.579220 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-var-lib-openvswitch\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.579257 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/275be8d3-df30-46f7-9d0a-806e404dfd57-host-slash\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 
03:18:01.580528 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-modprobe-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.580564 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-cni-netd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.580597 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-systemd\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.580636 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-var-lib-kubelet\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.580660 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-systemd-units\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.580688 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.580931 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-systemd\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.580983 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-audit-dir\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581051 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-sys\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581088 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581123 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-hostroot\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581160 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-netns\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581198 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581249 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-kubernetes\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581296 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 
03:18:01.581359 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-lib-modules\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581401 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581444 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysconfig\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581513 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-run-ovn-kubernetes\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581570 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-cnibin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581619 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-d\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581660 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/982ea338-c7be-4776-9bb7-113834c54aaa-host-etc-kube\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.581700 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.583124 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-multus\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.593479 master-0 kubenswrapper[13046]: I0308 03:18:01.578943 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-run-multus-certs\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593614 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" 
(UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593676 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-dir\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593712 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593783 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/af653e87-ce5f-4f1a-a20d-233c563694ba-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593794 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593910 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593910 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-dir\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.593970 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594298 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594368 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-run\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594427 13046 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594466 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594628 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594683 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/53254b19-b5b3-4f97-bc64-37be8b2a41b7-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594746 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-system-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594777 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-etc-kubernetes\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594872 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1092f2a6-865c-4706-bba7-068621e85ebc-rootfs\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594912 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-run\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594943 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-host-var-lib-cni-bin\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594986 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1092f2a6-865c-4706-bba7-068621e85ebc-rootfs\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594988 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-cni-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.594869 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") pod \"installer-1-master-0\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " pod="openshift-etcd/installer-1-master-0" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595049 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-socket-dir-parent\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595200 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595247 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-conf\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595323 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595354 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595352 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f99d6808-9fec-402d-93f7-41575a5a0a08-node-pullsecrets\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595460 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-host\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595546 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-host\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595557 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-kubelet\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595601 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-cnibin\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595599 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595613 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595712 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dfe0357f-dab4-4424-869c-f6070b411a35-hosts-file\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595673 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dfe0357f-dab4-4424-869c-f6070b411a35-hosts-file\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:18:01.595863 master-0 
kubenswrapper[13046]: I0308 03:18:01.595763 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-host-slash\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595765 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595804 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/de90d207-06d6-4778-b1b0-9020b1f2a881-etc-sysctl-conf\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595824 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595867 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 
03:18:01.595876 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-os-release\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595899 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") pod \"installer-4-master-0\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:18:01.595863 master-0 kubenswrapper[13046]: I0308 03:18:01.595808 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.595955 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-log-socket\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.595988 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.596680 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-multus-conf-dir\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.596730 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.596793 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.596916 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-node-log\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597029 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-run-ovn\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597503 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597261 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597435 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597592 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597630 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597656 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597690 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-system-cni-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597716 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e81d3c37-e8d7-44c8-973e-13992380ce85-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597741 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76ceb013-e999-4f15-bf25-f8dcd2647f9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:01.598261 master-0 kubenswrapper[13046]: I0308 03:18:01.597744 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.623840 master-0 kubenswrapper[13046]: I0308 03:18:01.613966 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" 
Mar 08 03:18:01.623840 master-0 kubenswrapper[13046]: I0308 03:18:01.618956 13046 scope.go:117] "RemoveContainer" containerID="384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532" Mar 08 03:18:01.623840 master-0 kubenswrapper[13046]: E0308 03:18:01.619315 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532\": container with ID starting with 384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532 not found: ID does not exist" containerID="384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532" Mar 08 03:18:01.623840 master-0 kubenswrapper[13046]: I0308 03:18:01.619357 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532"} err="failed to get container status \"384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532\": rpc error: code = NotFound desc = could not find container \"384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532\": container with ID starting with 384a981f34075d8ba1a1894f04c222842480dcf6e899a698f012da77548d0532 not found: ID does not exist" Mar 08 03:18:01.623840 master-0 kubenswrapper[13046]: I0308 03:18:01.619605 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 08 03:18:01.638243 master-0 kubenswrapper[13046]: I0308 03:18:01.633632 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 03:18:01.648983 master-0 kubenswrapper[13046]: I0308 03:18:01.639883 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-client\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.662085 master-0 kubenswrapper[13046]: I0308 03:18:01.662046 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 03:18:01.663263 master-0 kubenswrapper[13046]: I0308 03:18:01.663230 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:01.672874 master-0 kubenswrapper[13046]: I0308 03:18:01.672837 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 03:18:01.681188 master-0 kubenswrapper[13046]: I0308 03:18:01.681140 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-image-import-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.693658 master-0 kubenswrapper[13046]: I0308 03:18:01.693626 13046 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 03:18:01.698851 master-0 kubenswrapper[13046]: I0308 03:18:01.698790 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config\") pod \"234638fe-5577-45bc-9094-907c5611da38\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " Mar 08 03:18:01.699009 master-0 kubenswrapper[13046]: I0308 03:18:01.698969 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls\") pod \"234638fe-5577-45bc-9094-907c5611da38\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " Mar 08 03:18:01.699052 master-0 kubenswrapper[13046]: I0308 03:18:01.699031 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") pod \"7ea81472-8a81-45ec-a07d-8710f47a927d\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " Mar 08 03:18:01.699105 master-0 kubenswrapper[13046]: I0308 03:18:01.699081 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config\") pod \"234638fe-5577-45bc-9094-907c5611da38\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " Mar 08 03:18:01.699274 master-0 kubenswrapper[13046]: I0308 03:18:01.699239 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") pod \"7ea81472-8a81-45ec-a07d-8710f47a927d\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " Mar 08 03:18:01.699555 master-0 kubenswrapper[13046]: I0308 
03:18:01.699519 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "234638fe-5577-45bc-9094-907c5611da38" (UID: "234638fe-5577-45bc-9094-907c5611da38"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:18:01.699612 master-0 kubenswrapper[13046]: I0308 03:18:01.699586 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock" (OuterVolumeSpecName: "var-lock") pod "7ea81472-8a81-45ec-a07d-8710f47a927d" (UID: "7ea81472-8a81-45ec-a07d-8710f47a927d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:01.700319 master-0 kubenswrapper[13046]: I0308 03:18:01.699875 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7ea81472-8a81-45ec-a07d-8710f47a927d" (UID: "7ea81472-8a81-45ec-a07d-8710f47a927d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:01.700319 master-0 kubenswrapper[13046]: I0308 03:18:01.700226 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config" (OuterVolumeSpecName: "config") pod "234638fe-5577-45bc-9094-907c5611da38" (UID: "234638fe-5577-45bc-9094-907c5611da38"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:18:01.700760 master-0 kubenswrapper[13046]: I0308 03:18:01.700724 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.700760 master-0 kubenswrapper[13046]: I0308 03:18:01.700757 13046 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.700834 master-0 kubenswrapper[13046]: I0308 03:18:01.700774 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ea81472-8a81-45ec-a07d-8710f47a927d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.700834 master-0 kubenswrapper[13046]: I0308 03:18:01.700790 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234638fe-5577-45bc-9094-907c5611da38-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.708190 master-0 kubenswrapper[13046]: I0308 03:18:01.708156 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "234638fe-5577-45bc-9094-907c5611da38" (UID: "234638fe-5577-45bc-9094-907c5611da38"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:18:01.713070 master-0 kubenswrapper[13046]: I0308 03:18:01.713040 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 03:18:01.722874 master-0 kubenswrapper[13046]: I0308 03:18:01.722822 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-serving-cert\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.738913 master-0 kubenswrapper[13046]: I0308 03:18:01.738841 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 03:18:01.744087 master-0 kubenswrapper[13046]: I0308 03:18:01.743975 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:18:01.745604 master-0 kubenswrapper[13046]: I0308 03:18:01.745564 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-trusted-ca-bundle\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.754179 master-0 kubenswrapper[13046]: I0308 03:18:01.754122 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 03:18:01.761827 master-0 kubenswrapper[13046]: I0308 03:18:01.761786 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f99d6808-9fec-402d-93f7-41575a5a0a08-encryption-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " 
pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.774711 master-0 kubenswrapper[13046]: I0308 03:18:01.774687 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:18:01.779607 master-0 kubenswrapper[13046]: I0308 03:18:01.779575 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:18:01.781207 master-0 kubenswrapper[13046]: I0308 03:18:01.781163 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-etcd-serving-ca\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.784116 master-0 kubenswrapper[13046]: I0308 03:18:01.784081 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:18:01.793519 master-0 kubenswrapper[13046]: I0308 03:18:01.793446 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 03:18:01.801417 master-0 kubenswrapper[13046]: I0308 03:18:01.801361 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content\") pod \"1a6e3f01-0f22-4961-b450-56aca5477943\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " Mar 08 03:18:01.801417 master-0 kubenswrapper[13046]: I0308 03:18:01.801412 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities\") pod \"7e324f6c-ee4c-42bc-b241-9c6938749854\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " Mar 08 
03:18:01.801956 master-0 kubenswrapper[13046]: I0308 03:18:01.801544 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities\") pod \"1a6e3f01-0f22-4961-b450-56aca5477943\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " Mar 08 03:18:01.801956 master-0 kubenswrapper[13046]: I0308 03:18:01.801633 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content\") pod \"b05d5093-20f4-42d5-9db3-811e049cc1b6\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " Mar 08 03:18:01.801956 master-0 kubenswrapper[13046]: I0308 03:18:01.801663 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content\") pod \"7e324f6c-ee4c-42bc-b241-9c6938749854\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " Mar 08 03:18:01.801956 master-0 kubenswrapper[13046]: I0308 03:18:01.801693 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities\") pod \"b05d5093-20f4-42d5-9db3-811e049cc1b6\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " Mar 08 03:18:01.802945 master-0 kubenswrapper[13046]: I0308 03:18:01.802905 13046 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/234638fe-5577-45bc-9094-907c5611da38-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.804027 master-0 kubenswrapper[13046]: I0308 03:18:01.803975 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities" (OuterVolumeSpecName: 
"utilities") pod "b05d5093-20f4-42d5-9db3-811e049cc1b6" (UID: "b05d5093-20f4-42d5-9db3-811e049cc1b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:18:01.804414 master-0 kubenswrapper[13046]: I0308 03:18:01.804366 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a6e3f01-0f22-4961-b450-56aca5477943" (UID: "1a6e3f01-0f22-4961-b450-56aca5477943"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:18:01.805641 master-0 kubenswrapper[13046]: I0308 03:18:01.805593 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities" (OuterVolumeSpecName: "utilities") pod "7e324f6c-ee4c-42bc-b241-9c6938749854" (UID: "7e324f6c-ee4c-42bc-b241-9c6938749854"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:18:01.806684 master-0 kubenswrapper[13046]: I0308 03:18:01.806640 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities" (OuterVolumeSpecName: "utilities") pod "1a6e3f01-0f22-4961-b450-56aca5477943" (UID: "1a6e3f01-0f22-4961-b450-56aca5477943"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:18:01.806953 master-0 kubenswrapper[13046]: I0308 03:18:01.806911 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b05d5093-20f4-42d5-9db3-811e049cc1b6" (UID: "b05d5093-20f4-42d5-9db3-811e049cc1b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:18:01.807207 master-0 kubenswrapper[13046]: I0308 03:18:01.807162 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e324f6c-ee4c-42bc-b241-9c6938749854" (UID: "7e324f6c-ee4c-42bc-b241-9c6938749854"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:18:01.813524 master-0 kubenswrapper[13046]: I0308 03:18:01.813474 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 03:18:01.821026 master-0 kubenswrapper[13046]: I0308 03:18:01.820830 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-config\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.840091 master-0 kubenswrapper[13046]: I0308 03:18:01.840018 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 08 03:18:01.846284 master-0 kubenswrapper[13046]: I0308 03:18:01.845512 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/53254b19-b5b3-4f97-bc64-37be8b2a41b7-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.874094 master-0 kubenswrapper[13046]: I0308 03:18:01.874010 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 08 03:18:01.875240 master-0 kubenswrapper[13046]: I0308 03:18:01.875190 13046 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 08 03:18:01.883542 master-0 kubenswrapper[13046]: I0308 03:18:01.883464 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:01.893190 master-0 kubenswrapper[13046]: I0308 03:18:01.893136 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 03:18:01.902365 master-0 kubenswrapper[13046]: I0308 03:18:01.902302 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f99d6808-9fec-402d-93f7-41575a5a0a08-audit\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:01.904696 master-0 kubenswrapper[13046]: I0308 03:18:01.904446 13046 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-utilities\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.904696 master-0 kubenswrapper[13046]: I0308 03:18:01.904470 13046 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.904696 master-0 kubenswrapper[13046]: I0308 03:18:01.904502 13046 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.904696 master-0 
kubenswrapper[13046]: I0308 03:18:01.904687 13046 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b05d5093-20f4-42d5-9db3-811e049cc1b6-utilities\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.904696 master-0 kubenswrapper[13046]: I0308 03:18:01.904697 13046 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a6e3f01-0f22-4961-b450-56aca5477943-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.905028 master-0 kubenswrapper[13046]: I0308 03:18:01.904725 13046 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e324f6c-ee4c-42bc-b241-9c6938749854-utilities\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:01.913978 master-0 kubenswrapper[13046]: I0308 03:18:01.913918 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 08 03:18:01.934545 master-0 kubenswrapper[13046]: I0308 03:18:01.934453 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 03:18:01.945053 master-0 kubenswrapper[13046]: I0308 03:18:01.945015 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_2dc664e3-7f37-4fba-8104-544ffb18c1bd/installer/0.log" Mar 08 03:18:01.945182 master-0 kubenswrapper[13046]: I0308 03:18:01.945078 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 08 03:18:01.950686 master-0 kubenswrapper[13046]: I0308 03:18:01.950658 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_0781e6af-f5b5-40f7-bb7f-5bc6978b4957/installer/0.log" Mar 08 03:18:01.950907 master-0 kubenswrapper[13046]: I0308 03:18:01.950720 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 08 03:18:01.954043 master-0 kubenswrapper[13046]: I0308 03:18:01.954008 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 03:18:01.959885 master-0 kubenswrapper[13046]: I0308 03:18:01.959837 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_acb74744-fb99-4663-a7d0-7bae2db205e9/installer/0.log" Mar 08 03:18:01.959885 master-0 kubenswrapper[13046]: I0308 03:18:01.959878 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 08 03:18:01.973020 master-0 kubenswrapper[13046]: I0308 03:18:01.972993 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 03:18:01.976117 master-0 kubenswrapper[13046]: I0308 03:18:01.976082 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bda3bd48-6de3-49b0-b2ce-96d97e97f178-metrics-tls\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:18:01.994551 master-0 kubenswrapper[13046]: I0308 03:18:01.994405 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 03:18:01.998902 master-0 kubenswrapper[13046]: I0308 03:18:01.998869 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bda3bd48-6de3-49b0-b2ce-96d97e97f178-config-volume\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:18:02.005889 master-0 kubenswrapper[13046]: I0308 03:18:02.005841 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") pod \"acb74744-fb99-4663-a7d0-7bae2db205e9\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " Mar 08 03:18:02.006016 master-0 kubenswrapper[13046]: I0308 03:18:02.005952 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") pod \"acb74744-fb99-4663-a7d0-7bae2db205e9\" (UID: \"acb74744-fb99-4663-a7d0-7bae2db205e9\") " Mar 08 03:18:02.006016 master-0 kubenswrapper[13046]: I0308 03:18:02.005995 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") pod \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " Mar 08 03:18:02.006016 master-0 kubenswrapper[13046]: I0308 03:18:02.005995 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "acb74744-fb99-4663-a7d0-7bae2db205e9" (UID: "acb74744-fb99-4663-a7d0-7bae2db205e9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:02.006259 master-0 kubenswrapper[13046]: I0308 03:18:02.006061 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock" (OuterVolumeSpecName: "var-lock") pod "acb74744-fb99-4663-a7d0-7bae2db205e9" (UID: "acb74744-fb99-4663-a7d0-7bae2db205e9"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:02.006259 master-0 kubenswrapper[13046]: I0308 03:18:02.006114 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") pod \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " Mar 08 03:18:02.006259 master-0 kubenswrapper[13046]: I0308 03:18:02.006148 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") pod \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " Mar 08 03:18:02.006259 master-0 kubenswrapper[13046]: I0308 03:18:02.006181 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") pod \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " Mar 08 03:18:02.006259 master-0 kubenswrapper[13046]: I0308 03:18:02.006191 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock" (OuterVolumeSpecName: "var-lock") pod "0781e6af-f5b5-40f7-bb7f-5bc6978b4957" (UID: "0781e6af-f5b5-40f7-bb7f-5bc6978b4957"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:02.006259 master-0 kubenswrapper[13046]: I0308 03:18:02.006239 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock" (OuterVolumeSpecName: "var-lock") pod "2dc664e3-7f37-4fba-8104-544ffb18c1bd" (UID: "2dc664e3-7f37-4fba-8104-544ffb18c1bd"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:02.006666 master-0 kubenswrapper[13046]: I0308 03:18:02.006273 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2dc664e3-7f37-4fba-8104-544ffb18c1bd" (UID: "2dc664e3-7f37-4fba-8104-544ffb18c1bd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:02.006666 master-0 kubenswrapper[13046]: I0308 03:18:02.006387 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0781e6af-f5b5-40f7-bb7f-5bc6978b4957" (UID: "0781e6af-f5b5-40f7-bb7f-5bc6978b4957"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:18:02.007814 master-0 kubenswrapper[13046]: I0308 03:18:02.007773 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:02.007908 master-0 kubenswrapper[13046]: I0308 03:18:02.007814 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:02.007908 master-0 kubenswrapper[13046]: I0308 03:18:02.007837 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:02.007908 master-0 kubenswrapper[13046]: I0308 03:18:02.007857 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:02.007908 master-0 kubenswrapper[13046]: I0308 03:18:02.007876 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acb74744-fb99-4663-a7d0-7bae2db205e9-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:02.007908 master-0 kubenswrapper[13046]: I0308 03:18:02.007896 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:02.013737 master-0 kubenswrapper[13046]: I0308 03:18:02.013654 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 03:18:02.023130 master-0 kubenswrapper[13046]: I0308 03:18:02.023060 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-client\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:02.034170 master-0 kubenswrapper[13046]: I0308 03:18:02.034110 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 08 03:18:02.042207 master-0 kubenswrapper[13046]: I0308 03:18:02.042144 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-audit-policies\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:02.053360 master-0 kubenswrapper[13046]: I0308 03:18:02.053314 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 03:18:02.056891 master-0 kubenswrapper[13046]: I0308 03:18:02.056837 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-serving-cert\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:02.074237 master-0 kubenswrapper[13046]: I0308 03:18:02.074187 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 08 03:18:02.076808 master-0 kubenswrapper[13046]: I0308 03:18:02.076768 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dac2b210-2fbb-4d25-a0ea-1825259cee3b-encryption-config\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:02.095238 master-0 kubenswrapper[13046]: I0308 03:18:02.095184 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 03:18:02.105801 master-0 kubenswrapper[13046]: I0308 03:18:02.105722 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-etcd-serving-ca\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:02.113320 master-0 kubenswrapper[13046]: I0308 03:18:02.113262 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 03:18:02.120532 master-0 kubenswrapper[13046]: I0308 03:18:02.120420 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dac2b210-2fbb-4d25-a0ea-1825259cee3b-trusted-ca-bundle\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n"
Mar 08 03:18:02.129720 master-0 kubenswrapper[13046]: I0308 03:18:02.129647 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 08 03:18:02.134758 master-0 kubenswrapper[13046]: I0308 03:18:02.134703 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 03:18:02.154357 master-0 kubenswrapper[13046]: I0308 03:18:02.154284 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 08 03:18:02.175003 master-0 kubenswrapper[13046]: I0308 03:18:02.174954 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 08 03:18:02.194172 master-0 kubenswrapper[13046]: I0308 03:18:02.194113 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 03:18:02.201170 master-0 kubenswrapper[13046]: I0308 03:18:02.201099 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af653e87-ce5f-4f1a-a20d-233c563694ba-serving-cert\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v"
Mar 08 03:18:02.214084 master-0 kubenswrapper[13046]: I0308 03:18:02.214032 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-r5m92"
Mar 08 03:18:02.233824 master-0 kubenswrapper[13046]: I0308 03:18:02.233752 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 03:18:02.235258 master-0 kubenswrapper[13046]: I0308 03:18:02.235200 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/af653e87-ce5f-4f1a-a20d-233c563694ba-service-ca\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v"
Mar 08 03:18:02.253706 master-0 kubenswrapper[13046]: I0308 03:18:02.253582 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-chsmd"
Mar 08 03:18:02.273451 master-0 kubenswrapper[13046]: I0308 03:18:02.273365 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 08 03:18:02.279362 master-0 kubenswrapper[13046]: I0308 03:18:02.279302 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-webhook-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz"
Mar 08 03:18:02.279539 master-0 kubenswrapper[13046]: I0308 03:18:02.279431 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/febf6a91-8b78-4b22-93b9-155cb7761fc4-apiservice-cert\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz"
Mar 08 03:18:02.293645 master-0 kubenswrapper[13046]: I0308 03:18:02.293572 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-4dw5m"
Mar 08 03:18:02.312559 master-0 kubenswrapper[13046]: I0308 03:18:02.312454 13046 request.go:700] Waited for 1.01559892s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dcommunity-operators-dockercfg-lwkgm&limit=500&resourceVersion=0
Mar 08 03:18:02.314662 master-0 kubenswrapper[13046]: I0308 03:18:02.314614 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-lwkgm"
Mar 08 03:18:02.334617 master-0 kubenswrapper[13046]: I0308 03:18:02.334552 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-dmv4m"
Mar 08 03:18:02.353813 master-0 kubenswrapper[13046]: I0308 03:18:02.353750 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 03:18:02.358163 master-0 kubenswrapper[13046]: I0308 03:18:02.358097 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd"
Mar 08 03:18:02.376248 master-0 kubenswrapper[13046]: I0308 03:18:02.376186 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-5lx9s"
Mar 08 03:18:02.394172 master-0 kubenswrapper[13046]: I0308 03:18:02.394120 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qktgm"
Mar 08 03:18:02.414370 master-0 kubenswrapper[13046]: I0308 03:18:02.414308 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 08 03:18:02.423149 master-0 kubenswrapper[13046]: I0308 03:18:02.423083 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e569889-4759-4046-b0ed-e550078521c6-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb"
Mar 08 03:18:02.433892 master-0 kubenswrapper[13046]: I0308 03:18:02.433829 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 03:18:02.437919 master-0 kubenswrapper[13046]: I0308 03:18:02.437851 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-service-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq"
Mar 08 03:18:02.448834 master-0 kubenswrapper[13046]: E0308 03:18:02.448783 13046 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.448982 master-0 kubenswrapper[13046]: E0308 03:18:02.448868 13046 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.448982 master-0 kubenswrapper[13046]: E0308 03:18:02.448870 13046 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.448982 master-0 kubenswrapper[13046]: E0308 03:18:02.448949 13046 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.448982 master-0 kubenswrapper[13046]: E0308 03:18:02.448957 13046 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.449374 master-0 kubenswrapper[13046]: E0308 03:18:02.449059 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle podName:b33ed2de-435b-4ccc-8dfd-29d52bf95ea8 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.948856578 +0000 UTC m=+285.027623835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle") pod "insights-operator-8f89dfddd-zd6kq" (UID: "b33ed2de-435b-4ccc-8dfd-29d52bf95ea8") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.449374 master-0 kubenswrapper[13046]: E0308 03:18:02.449160 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert podName:6bd07fa0-00f3-4267-b64a-1e7c02fdf148 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.949136756 +0000 UTC m=+285.027904103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert") pod "controller-manager-6494b94d74-kwkcq" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.449374 master-0 kubenswrapper[13046]: E0308 03:18:02.449207 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images podName:52836130-d42e-495c-adbf-19ff9a393347 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.949190687 +0000 UTC m=+285.027958064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images") pod "cluster-cloud-controller-manager-operator-559568b945-kkm7z" (UID: "52836130-d42e-495c-adbf-19ff9a393347") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.449374 master-0 kubenswrapper[13046]: E0308 03:18:02.449247 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config podName:9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.949232358 +0000 UTC m=+285.027999735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config") pod "route-controller-manager-6fbc9556d8-l758n" (UID: "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.449374 master-0 kubenswrapper[13046]: E0308 03:18:02.449294 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles podName:6bd07fa0-00f3-4267-b64a-1e7c02fdf148 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.94927577 +0000 UTC m=+285.028043147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles") pod "controller-manager-6494b94d74-kwkcq" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.450326 master-0 kubenswrapper[13046]: E0308 03:18:02.450265 13046 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.450408 master-0 kubenswrapper[13046]: E0308 03:18:02.450342 13046 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.450408 master-0 kubenswrapper[13046]: E0308 03:18:02.450385 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config podName:1092f2a6-865c-4706-bba7-068621e85ebc nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.950353218 +0000 UTC m=+285.029120495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config") pod "machine-config-daemon-j6n9g" (UID: "1092f2a6-865c-4706-bba7-068621e85ebc") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.450564 master-0 kubenswrapper[13046]: E0308 03:18:02.450427 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls podName:fe33f926-9348-4498-a892-d2becaeecc14 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.95040552 +0000 UTC m=+285.029172797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls") pod "machine-config-operator-fdb5c78b5-dddvl" (UID: "fe33f926-9348-4498-a892-d2becaeecc14") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.451909 master-0 kubenswrapper[13046]: E0308 03:18:02.451809 13046 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.452038 master-0 kubenswrapper[13046]: E0308 03:18:02.451997 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert podName:17eaab63-9ba9-4a4a-891d-a76aa3f03b46 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.951969161 +0000 UTC m=+285.030736478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert") pod "cluster-autoscaler-operator-69576476f7-cpnw6" (UID: "17eaab63-9ba9-4a4a-891d-a76aa3f03b46") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.453053 master-0 kubenswrapper[13046]: E0308 03:18:02.452947 13046 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.453139 master-0 kubenswrapper[13046]: E0308 03:18:02.453115 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls podName:1092f2a6-865c-4706-bba7-068621e85ebc nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.953095671 +0000 UTC m=+285.031862918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls") pod "machine-config-daemon-j6n9g" (UID: "1092f2a6-865c-4706-bba7-068621e85ebc") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.453332 master-0 kubenswrapper[13046]: I0308 03:18:02.453288 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 08 03:18:02.453658 master-0 kubenswrapper[13046]: E0308 03:18:02.453614 13046 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.453755 master-0 kubenswrapper[13046]: E0308 03:18:02.453705 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls podName:52836130-d42e-495c-adbf-19ff9a393347 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.953682536 +0000 UTC m=+285.032449863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-559568b945-kkm7z" (UID: "52836130-d42e-495c-adbf-19ff9a393347") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.453834 master-0 kubenswrapper[13046]: E0308 03:18:02.453789 13046 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.453898 master-0 kubenswrapper[13046]: E0308 03:18:02.453853 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config podName:fe33f926-9348-4498-a892-d2becaeecc14 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.9538334 +0000 UTC m=+285.032600717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-dddvl" (UID: "fe33f926-9348-4498-a892-d2becaeecc14") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.453970 master-0 kubenswrapper[13046]: E0308 03:18:02.453931 13046 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.454064 master-0 kubenswrapper[13046]: E0308 03:18:02.453997 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca podName:9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.953977684 +0000 UTC m=+285.032745051 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca") pod "route-controller-manager-6fbc9556d8-l758n" (UID: "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.455250 master-0 kubenswrapper[13046]: E0308 03:18:02.455176 13046 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.455369 master-0 kubenswrapper[13046]: E0308 03:18:02.455299 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca podName:6bd07fa0-00f3-4267-b64a-1e7c02fdf148 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.955268858 +0000 UTC m=+285.034036106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca") pod "controller-manager-6494b94d74-kwkcq" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.455369 master-0 kubenswrapper[13046]: E0308 03:18:02.455331 13046 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.455618 master-0 kubenswrapper[13046]: E0308 03:18:02.455377 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert podName:b33ed2de-435b-4ccc-8dfd-29d52bf95ea8 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.955363771 +0000 UTC m=+285.034131028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert") pod "insights-operator-8f89dfddd-zd6kq" (UID: "b33ed2de-435b-4ccc-8dfd-29d52bf95ea8") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.456648 master-0 kubenswrapper[13046]: I0308 03:18:02.456595 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6"
Mar 08 03:18:02.457574 master-0 kubenswrapper[13046]: E0308 03:18:02.457522 13046 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.457679 master-0 kubenswrapper[13046]: E0308 03:18:02.457577 13046 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.457679 master-0 kubenswrapper[13046]: E0308 03:18:02.457622 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca podName:caa3a50c-1291-4152-a48a-f7c7b49627db nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.95758982 +0000 UTC m=+285.036357067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca") pod "cloud-credential-operator-55d85b7b47-rggnq" (UID: "caa3a50c-1291-4152-a48a-f7c7b49627db") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.457811 master-0 kubenswrapper[13046]: E0308 03:18:02.457690 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images podName:fe33f926-9348-4498-a892-d2becaeecc14 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.957666612 +0000 UTC m=+285.036433869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images") pod "machine-config-operator-fdb5c78b5-dddvl" (UID: "fe33f926-9348-4498-a892-d2becaeecc14") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.459062 master-0 kubenswrapper[13046]: E0308 03:18:02.459008 13046 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.459177 master-0 kubenswrapper[13046]: E0308 03:18:02.459087 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert podName:caa3a50c-1291-4152-a48a-f7c7b49627db nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.959068569 +0000 UTC m=+285.037835826 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-55d85b7b47-rggnq" (UID: "caa3a50c-1291-4152-a48a-f7c7b49627db") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.459177 master-0 kubenswrapper[13046]: E0308 03:18:02.459129 13046 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.459177 master-0 kubenswrapper[13046]: E0308 03:18:02.459169 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert podName:9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.959156502 +0000 UTC m=+285.037923759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert") pod "route-controller-manager-6fbc9556d8-l758n" (UID: "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.460385 master-0 kubenswrapper[13046]: E0308 03:18:02.460320 13046 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.460385 master-0 kubenswrapper[13046]: E0308 03:18:02.460377 13046 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.460726 master-0 kubenswrapper[13046]: E0308 03:18:02.460429 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls podName:33c15b06-a21e-411f-b324-3ae0c7f0e9a4 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.960403845 +0000 UTC m=+285.039171192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls") pod "cluster-samples-operator-664cb58b85-lhvrm" (UID: "33c15b06-a21e-411f-b324-3ae0c7f0e9a4") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.460726 master-0 kubenswrapper[13046]: E0308 03:18:02.460475 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config podName:52836130-d42e-495c-adbf-19ff9a393347 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.960453626 +0000 UTC m=+285.039220993 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-559568b945-kkm7z" (UID: "52836130-d42e-495c-adbf-19ff9a393347") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.462730 master-0 kubenswrapper[13046]: E0308 03:18:02.462680 13046 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.462873 master-0 kubenswrapper[13046]: E0308 03:18:02.462786 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls podName:c3729e29-4c57-4f9b-8202-a87fd3a9a722 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.962766217 +0000 UTC m=+285.041533474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls") pod "machine-api-operator-84bf6db4f9-qt654" (UID: "c3729e29-4c57-4f9b-8202-a87fd3a9a722") : failed to sync secret cache: timed out waiting for the condition
Mar 08 03:18:02.464925 master-0 kubenswrapper[13046]: E0308 03:18:02.464863 13046 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.465070 master-0 kubenswrapper[13046]: E0308 03:18:02.464949 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config podName:c3729e29-4c57-4f9b-8202-a87fd3a9a722 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.964932815 +0000 UTC m=+285.043700062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config") pod "machine-api-operator-84bf6db4f9-qt654" (UID: "c3729e29-4c57-4f9b-8202-a87fd3a9a722") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.466878 master-0 kubenswrapper[13046]: E0308 03:18:02.466819 13046 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.466976 master-0 kubenswrapper[13046]: E0308 03:18:02.466918 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config podName:6bd07fa0-00f3-4267-b64a-1e7c02fdf148 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.966898877 +0000 UTC m=+285.045666124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config") pod "controller-manager-6494b94d74-kwkcq" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.468067 master-0 kubenswrapper[13046]: E0308 03:18:02.468013 13046 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.468221 master-0 kubenswrapper[13046]: E0308 03:18:02.468180 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images podName:c3729e29-4c57-4f9b-8202-a87fd3a9a722 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:02.968123759 +0000 UTC m=+285.046891016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images") pod "machine-api-operator-84bf6db4f9-qt654" (UID: "c3729e29-4c57-4f9b-8202-a87fd3a9a722") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 03:18:02.493953 master-0 kubenswrapper[13046]: I0308 03:18:02.493847 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:18:02.493953 master-0 kubenswrapper[13046]: I0308 03:18:02.493932 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 08 03:18:02.515454 master-0 kubenswrapper[13046]: I0308 03:18:02.515276 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l8646"
Mar 08 03:18:02.534634 master-0 kubenswrapper[13046]: I0308 03:18:02.534547 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 03:18:02.556607 master-0 kubenswrapper[13046]: I0308 03:18:02.555663 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 03:18:02.575382 master-0 kubenswrapper[13046]: I0308 03:18:02.574341 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:18:02.584911 master-0 kubenswrapper[13046]: I0308 03:18:02.584785 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nggbb"
Mar 08 03:18:02.587877 master-0 kubenswrapper[13046]: I0308 03:18:02.587839 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_2dc664e3-7f37-4fba-8104-544ffb18c1bd/installer/0.log"
Mar 08 03:18:02.588444 master-0 kubenswrapper[13046]: I0308 03:18:02.588046 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 08 03:18:02.592800 master-0 kubenswrapper[13046]: I0308 03:18:02.592429 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86z4t"
Mar 08 03:18:02.596794 master-0 kubenswrapper[13046]: I0308 03:18:02.596676 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_acb74744-fb99-4663-a7d0-7bae2db205e9/installer/0.log"
Mar 08 03:18:02.597024 master-0 kubenswrapper[13046]: I0308 03:18:02.596931 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 08 03:18:02.600143 master-0 kubenswrapper[13046]: I0308 03:18:02.600084 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_0781e6af-f5b5-40f7-bb7f-5bc6978b4957/installer/0.log"
Mar 08 03:18:02.600327 master-0 kubenswrapper[13046]: I0308 03:18:02.600307 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 08 03:18:02.603796 master-0 kubenswrapper[13046]: I0308 03:18:02.603746 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 08 03:18:02.603796 master-0 kubenswrapper[13046]: I0308 03:18:02.603790 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r494d"
Mar 08 03:18:02.605145 master-0 kubenswrapper[13046]: I0308 03:18:02.605104 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 03:18:02.614958 master-0 kubenswrapper[13046]: I0308 03:18:02.614884 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 03:18:02.633527 master-0 kubenswrapper[13046]: I0308 03:18:02.633448 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 03:18:02.654055 master-0 kubenswrapper[13046]: I0308 03:18:02.654006 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 08 03:18:02.673676 master-0 kubenswrapper[13046]: I0308 03:18:02.673627 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-xfphx"
Mar 08 03:18:02.693853 master-0 kubenswrapper[13046]: I0308 03:18:02.693793 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 03:18:02.713587 master-0 kubenswrapper[13046]: I0308 03:18:02.713537 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:18:02.735098 master-0 kubenswrapper[13046]: I0308 03:18:02.735032 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-svw57" Mar 08 03:18:02.753819 master-0 kubenswrapper[13046]: I0308 03:18:02.753743 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-twhrj" Mar 08 03:18:02.774350 master-0 kubenswrapper[13046]: I0308 03:18:02.773900 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:18:02.793296 master-0 kubenswrapper[13046]: I0308 03:18:02.793241 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:18:02.823535 master-0 kubenswrapper[13046]: I0308 03:18:02.823409 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 08 03:18:02.834744 master-0 kubenswrapper[13046]: I0308 03:18:02.834444 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rdhz7" Mar 08 03:18:02.855325 master-0 kubenswrapper[13046]: I0308 03:18:02.855257 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-rkgwg" Mar 08 03:18:02.875135 master-0 kubenswrapper[13046]: I0308 03:18:02.874816 13046 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 03:18:02.894012 master-0 kubenswrapper[13046]: I0308 03:18:02.893947 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:18:02.913607 master-0 kubenswrapper[13046]: I0308 03:18:02.913479 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 08 03:18:02.933978 master-0 kubenswrapper[13046]: I0308 03:18:02.933917 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-sbm8j" Mar 08 03:18:02.954001 master-0 kubenswrapper[13046]: I0308 03:18:02.953926 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 08 03:18:02.974270 master-0 kubenswrapper[13046]: I0308 03:18:02.974196 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:18:02.994108 master-0 kubenswrapper[13046]: I0308 03:18:02.994033 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 08 03:18:03.013990 master-0 kubenswrapper[13046]: I0308 03:18:03.013906 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 03:18:03.032300 master-0 kubenswrapper[13046]: I0308 03:18:03.032150 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " 
pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" Mar 08 03:18:03.032300 master-0 kubenswrapper[13046]: I0308 03:18:03.032245 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:03.032710 master-0 kubenswrapper[13046]: I0308 03:18:03.032649 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" Mar 08 03:18:03.032831 master-0 kubenswrapper[13046]: I0308 03:18:03.032795 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:03.032896 master-0 kubenswrapper[13046]: I0308 03:18:03.032859 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:03.033060 
master-0 kubenswrapper[13046]: I0308 03:18:03.032918 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm" Mar 08 03:18:03.033060 master-0 kubenswrapper[13046]: I0308 03:18:03.032958 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/caa3a50c-1291-4152-a48a-f7c7b49627db-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" Mar 08 03:18:03.033366 master-0 kubenswrapper[13046]: I0308 03:18:03.033297 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:03.033476 master-0 kubenswrapper[13046]: I0308 03:18:03.033368 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:03.033611 master-0 kubenswrapper[13046]: I0308 03:18:03.033546 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/caa3a50c-1291-4152-a48a-f7c7b49627db-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" Mar 08 03:18:03.033611 master-0 kubenswrapper[13046]: I0308 03:18:03.033586 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:03.033811 master-0 kubenswrapper[13046]: I0308 03:18:03.033701 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.033811 master-0 kubenswrapper[13046]: I0308 03:18:03.033765 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:03.033987 master-0 kubenswrapper[13046]: I0308 03:18:03.033869 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 
03:18:03.033987 master-0 kubenswrapper[13046]: I0308 03:18:03.033926 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.033992 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034123 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034234 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034333 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034347 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034430 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034573 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034638 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: 
I0308 03:18:03.034714 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034721 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034927 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe33f926-9348-4498-a892-d2becaeecc14-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.034933 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.035073 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") pod 
\"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.035096 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-f5lxw" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.035081 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.035210 master-0 kubenswrapper[13046]: I0308 03:18:03.035200 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:03.036724 master-0 kubenswrapper[13046]: I0308 03:18:03.035308 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:03.036724 master-0 kubenswrapper[13046]: I0308 03:18:03.035388 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:03.036724 master-0 kubenswrapper[13046]: I0308 03:18:03.035430 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:03.036724 master-0 kubenswrapper[13046]: I0308 03:18:03.035587 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:03.036724 master-0 kubenswrapper[13046]: I0308 03:18:03.035724 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.036724 master-0 kubenswrapper[13046]: I0308 03:18:03.035754 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-serving-cert\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: 
\"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:03.036724 master-0 kubenswrapper[13046]: I0308 03:18:03.036147 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.053925 master-0 kubenswrapper[13046]: I0308 03:18:03.053863 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:18:03.064530 master-0 kubenswrapper[13046]: I0308 03:18:03.064437 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:03.074367 master-0 kubenswrapper[13046]: I0308 03:18:03.074275 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 08 03:18:03.094379 master-0 kubenswrapper[13046]: I0308 03:18:03.094312 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:18:03.114006 master-0 kubenswrapper[13046]: I0308 03:18:03.113937 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 03:18:03.123737 master-0 kubenswrapper[13046]: I0308 03:18:03.123669 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm" Mar 08 03:18:03.134214 master-0 kubenswrapper[13046]: I0308 03:18:03.134168 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 03:18:03.154250 master-0 kubenswrapper[13046]: I0308 03:18:03.154181 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 03:18:03.163902 master-0 kubenswrapper[13046]: I0308 03:18:03.163848 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3729e29-4c57-4f9b-8202-a87fd3a9a722-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:03.174726 master-0 kubenswrapper[13046]: I0308 03:18:03.174682 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-r4dpg" Mar 08 03:18:03.194706 master-0 kubenswrapper[13046]: I0308 03:18:03.194627 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 03:18:03.205269 master-0 kubenswrapper[13046]: I0308 03:18:03.205127 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-config\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:03.213907 master-0 
kubenswrapper[13046]: I0308 03:18:03.213833 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 08 03:18:03.216352 master-0 kubenswrapper[13046]: I0308 03:18:03.216304 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-cert\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" Mar 08 03:18:03.234479 master-0 kubenswrapper[13046]: I0308 03:18:03.234421 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 03:18:03.235965 master-0 kubenswrapper[13046]: I0308 03:18:03.235901 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1092f2a6-865c-4706-bba7-068621e85ebc-proxy-tls\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:03.253711 master-0 kubenswrapper[13046]: I0308 03:18:03.253654 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 03:18:03.273726 master-0 kubenswrapper[13046]: I0308 03:18:03.273695 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 03:18:03.296473 master-0 kubenswrapper[13046]: I0308 03:18:03.296319 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 03:18:03.305529 master-0 kubenswrapper[13046]: I0308 03:18:03.305429 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/c3729e29-4c57-4f9b-8202-a87fd3a9a722-images\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:03.314899 master-0 kubenswrapper[13046]: I0308 03:18:03.314841 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 03:18:03.331967 master-0 kubenswrapper[13046]: I0308 03:18:03.331881 13046 request.go:700] Waited for 2.005281812s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcluster-autoscaler-operator-dockercfg-tvw7c&limit=500&resourceVersion=0 Mar 08 03:18:03.334468 master-0 kubenswrapper[13046]: I0308 03:18:03.334414 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tvw7c" Mar 08 03:18:03.354822 master-0 kubenswrapper[13046]: I0308 03:18:03.354767 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 03:18:03.355352 master-0 kubenswrapper[13046]: I0308 03:18:03.355270 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1092f2a6-865c-4706-bba7-068621e85ebc-mcd-auth-proxy-config\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:03.357010 master-0 kubenswrapper[13046]: I0308 03:18:03.356953 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: 
\"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:03.375061 master-0 kubenswrapper[13046]: I0308 03:18:03.374782 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:18:03.385597 master-0 kubenswrapper[13046]: I0308 03:18:03.385529 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:03.393864 master-0 kubenswrapper[13046]: I0308 03:18:03.393806 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 03:18:03.403887 master-0 kubenswrapper[13046]: I0308 03:18:03.403828 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe33f926-9348-4498-a892-d2becaeecc14-images\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:03.438718 master-0 kubenswrapper[13046]: I0308 03:18:03.438634 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghdk\" (UniqueName: \"kubernetes.io/projected/eedc7538-9cc6-4bf5-9628-e278310d796b-kube-api-access-lghdk\") pod \"marketplace-operator-64bf9778cb-7hsbf\" (UID: \"eedc7538-9cc6-4bf5-9628-e278310d796b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:18:03.455898 master-0 kubenswrapper[13046]: I0308 03:18:03.455587 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lb9w\" (UniqueName: \"kubernetes.io/projected/8c0192f3-2e60-42c6-9836-c70a9fa407d5-kube-api-access-4lb9w\") pod \"etcd-operator-5884b9cd56-gfmq4\" (UID: \"8c0192f3-2e60-42c6-9836-c70a9fa407d5\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:18:03.478790 master-0 kubenswrapper[13046]: I0308 03:18:03.478707 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf24l\" (UniqueName: \"kubernetes.io/projected/70fba73e-c201-4866-bc69-64892ea5bdca-kube-api-access-wf24l\") pod \"openshift-controller-manager-operator-8565d84698-qsgq7\" (UID: \"70fba73e-c201-4866-bc69-64892ea5bdca\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" Mar 08 03:18:03.485906 master-0 kubenswrapper[13046]: I0308 03:18:03.485848 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbqkj\" (UniqueName: \"kubernetes.io/projected/d83aa242-606f-4adc-b689-4aa89625b533-kube-api-access-hbqkj\") pod \"catalog-operator-7d9c49f57b-vsnbw\" (UID: \"d83aa242-606f-4adc-b689-4aa89625b533\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:18:03.493894 master-0 kubenswrapper[13046]: I0308 03:18:03.493838 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-29dgn" Mar 08 03:18:03.530245 master-0 kubenswrapper[13046]: I0308 03:18:03.530153 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mv56\" (UniqueName: \"kubernetes.io/projected/ba9496ed-060e-4118-9da6-89b82bd49263-kube-api-access-6mv56\") pod \"csi-snapshot-controller-operator-5685fbc7d-8fxl8\" (UID: \"ba9496ed-060e-4118-9da6-89b82bd49263\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" Mar 08 03:18:03.556624 master-0 kubenswrapper[13046]: 
I0308 03:18:03.556431 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aadbbe97-2a03-40da-846d-252e29661f67-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-sjdgk\" (UID: \"aadbbe97-2a03-40da-846d-252e29661f67\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" Mar 08 03:18:03.580587 master-0 kubenswrapper[13046]: I0308 03:18:03.577905 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vt6t\" (UniqueName: \"kubernetes.io/projected/2bbe9b81-0efb-4caa-bacd-55348cd392c6-kube-api-access-5vt6t\") pod \"package-server-manager-854648ff6d-2gxdj\" (UID: \"2bbe9b81-0efb-4caa-bacd-55348cd392c6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:18:03.597943 master-0 kubenswrapper[13046]: I0308 03:18:03.597878 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3178dfc0-a35e-418e-a954-cd919b8af88c-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-hg2f6\" (UID: \"3178dfc0-a35e-418e-a954-cd919b8af88c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" Mar 08 03:18:03.665128 master-0 kubenswrapper[13046]: I0308 03:18:03.664877 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfxd\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-kube-api-access-2xfxd\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:18:03.667914 master-0 kubenswrapper[13046]: I0308 03:18:03.667836 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdxk\" (UniqueName: 
\"kubernetes.io/projected/e71caa06-6ce7-47c9-a267-21f6b6af9247-kube-api-access-zrdxk\") pod \"kube-storage-version-migrator-operator-7f65c457f5-8wv6c\" (UID: \"e71caa06-6ce7-47c9-a267-21f6b6af9247\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" Mar 08 03:18:03.670645 master-0 kubenswrapper[13046]: I0308 03:18:03.670588 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-nnd8x_bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/cluster-node-tuning-operator/0.log" Mar 08 03:18:03.670740 master-0 kubenswrapper[13046]: I0308 03:18:03.670711 13046 generic.go:334] "Generic (PLEG): container finished" podID="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" containerID="8a3da09cabdcb126428fcd447defcc99973fd5db3565d3792f66591da1ac8333" exitCode=1 Mar 08 03:18:03.672615 master-0 kubenswrapper[13046]: I0308 03:18:03.672567 13046 scope.go:117] "RemoveContainer" containerID="a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0" Mar 08 03:18:03.672735 master-0 kubenswrapper[13046]: I0308 03:18:03.672695 13046 scope.go:117] "RemoveContainer" containerID="df3e8baabefc90e04c02f0f45ed7aa89841f1f4954012b9c683b090559c5e516" Mar 08 03:18:03.673942 master-0 kubenswrapper[13046]: I0308 03:18:03.673321 13046 scope.go:117] "RemoveContainer" containerID="73db2b17db7b45f368583714c7423ad3baed3f0e6461afd93878b41dc72e8454" Mar 08 03:18:03.673942 master-0 kubenswrapper[13046]: I0308 03:18:03.673365 13046 scope.go:117] "RemoveContainer" containerID="2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9" Mar 08 03:18:03.676761 master-0 kubenswrapper[13046]: I0308 03:18:03.676732 13046 scope.go:117] "RemoveContainer" containerID="dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62" Mar 08 03:18:03.679096 master-0 kubenswrapper[13046]: I0308 03:18:03.679046 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m2vmz\" (UniqueName: \"kubernetes.io/projected/5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b-kube-api-access-m2vmz\") pod \"cluster-baremetal-operator-5cdb4c5598-gwv4q\" (UID: \"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" Mar 08 03:18:03.680063 master-0 kubenswrapper[13046]: I0308 03:18:03.680018 13046 scope.go:117] "RemoveContainer" containerID="7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e" Mar 08 03:18:03.680555 master-0 kubenswrapper[13046]: I0308 03:18:03.680503 13046 scope.go:117] "RemoveContainer" containerID="643f3b1d5189adb625272097c9d23e7af0847cd627439de5de3ccca7ed7bb060" Mar 08 03:18:03.681283 master-0 kubenswrapper[13046]: I0308 03:18:03.681244 13046 scope.go:117] "RemoveContainer" containerID="67b6371de1e40f11492bdbedad65b4bb4c5dafeb7f94b97c8372fcadf4c1308d" Mar 08 03:18:03.681370 master-0 kubenswrapper[13046]: I0308 03:18:03.681318 13046 scope.go:117] "RemoveContainer" containerID="5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2" Mar 08 03:18:03.690301 master-0 kubenswrapper[13046]: I0308 03:18:03.690247 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e74c8bb2-e063-4b60-b3fe-651aa534d029-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-4vqgc\" (UID: \"e74c8bb2-e063-4b60-b3fe-651aa534d029\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" Mar 08 03:18:03.732179 master-0 kubenswrapper[13046]: I0308 03:18:03.732133 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkfwp\" (UniqueName: \"kubernetes.io/projected/7d23557f-6bb1-46ce-a56e-d0011c576125-kube-api-access-bkfwp\") pod \"cluster-olm-operator-77899cf6d-h4ldq\" (UID: \"7d23557f-6bb1-46ce-a56e-d0011c576125\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" Mar 08 03:18:03.742797 
master-0 kubenswrapper[13046]: I0308 03:18:03.742198 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgw92\" (UniqueName: \"kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92\") pod \"community-operators-86z4t\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " pod="openshift-marketplace/community-operators-86z4t" Mar 08 03:18:03.770377 master-0 kubenswrapper[13046]: I0308 03:18:03.770279 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgw92\" (UniqueName: \"kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92\") pod \"7e324f6c-ee4c-42bc-b241-9c6938749854\" (UID: \"7e324f6c-ee4c-42bc-b241-9c6938749854\") " Mar 08 03:18:03.770891 master-0 kubenswrapper[13046]: I0308 03:18:03.770745 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb2xh\" (UniqueName: \"kubernetes.io/projected/17eaab63-9ba9-4a4a-891d-a76aa3f03b46-kube-api-access-pb2xh\") pod \"cluster-autoscaler-operator-69576476f7-cpnw6\" (UID: \"17eaab63-9ba9-4a4a-891d-a76aa3f03b46\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" Mar 08 03:18:03.771352 master-0 kubenswrapper[13046]: I0308 03:18:03.771274 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7hzl\" (UniqueName: \"kubernetes.io/projected/7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c-kube-api-access-m7hzl\") pod \"network-metrics-daemon-jl9tj\" (UID: \"7bb1fd59-5e3e-4711-83cf-c5cf2ec7622c\") " pod="openshift-multus/network-metrics-daemon-jl9tj" Mar 08 03:18:03.774046 master-0 kubenswrapper[13046]: I0308 03:18:03.773993 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92" (OuterVolumeSpecName: "kube-api-access-qgw92") pod "7e324f6c-ee4c-42bc-b241-9c6938749854" (UID: 
"7e324f6c-ee4c-42bc-b241-9c6938749854"). InnerVolumeSpecName "kube-api-access-qgw92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:03.790627 master-0 kubenswrapper[13046]: I0308 03:18:03.790275 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8dt\" (UniqueName: \"kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt\") pod \"redhat-marketplace-nggbb\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " pod="openshift-marketplace/redhat-marketplace-nggbb" Mar 08 03:18:03.811016 master-0 kubenswrapper[13046]: I0308 03:18:03.810888 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af653e87-ce5f-4f1a-a20d-233c563694ba-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-2z74v\" (UID: \"af653e87-ce5f-4f1a-a20d-233c563694ba\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" Mar 08 03:18:03.843516 master-0 kubenswrapper[13046]: I0308 03:18:03.840352 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-982r4\" (UniqueName: \"kubernetes.io/projected/febf6a91-8b78-4b22-93b9-155cb7761fc4-kube-api-access-982r4\") pod \"packageserver-5675c97455-jxfcz\" (UID: \"febf6a91-8b78-4b22-93b9-155cb7761fc4\") " pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:03.848183 master-0 kubenswrapper[13046]: I0308 03:18:03.848145 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9jsw\" (UniqueName: \"kubernetes.io/projected/50ab8f71-42b8-4967-8a0b-016647c59a37-kube-api-access-h9jsw\") pod \"certified-operators-jnlct\" (UID: \"50ab8f71-42b8-4967-8a0b-016647c59a37\") " pod="openshift-marketplace/certified-operators-jnlct" Mar 08 03:18:03.865021 master-0 kubenswrapper[13046]: I0308 03:18:03.864978 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rtt8w\" (UniqueName: \"kubernetes.io/projected/fe33f926-9348-4498-a892-d2becaeecc14-kube-api-access-rtt8w\") pod \"machine-config-operator-fdb5c78b5-dddvl\" (UID: \"fe33f926-9348-4498-a892-d2becaeecc14\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" Mar 08 03:18:03.871715 master-0 kubenswrapper[13046]: I0308 03:18:03.871567 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj8dt\" (UniqueName: \"kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt\") pod \"1a6e3f01-0f22-4961-b450-56aca5477943\" (UID: \"1a6e3f01-0f22-4961-b450-56aca5477943\") " Mar 08 03:18:03.872701 master-0 kubenswrapper[13046]: I0308 03:18:03.872667 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgw92\" (UniqueName: \"kubernetes.io/projected/7e324f6c-ee4c-42bc-b241-9c6938749854-kube-api-access-qgw92\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:03.874264 master-0 kubenswrapper[13046]: I0308 03:18:03.874223 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt" (OuterVolumeSpecName: "kube-api-access-qj8dt") pod "1a6e3f01-0f22-4961-b450-56aca5477943" (UID: "1a6e3f01-0f22-4961-b450-56aca5477943"). InnerVolumeSpecName "kube-api-access-qj8dt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:03.887464 master-0 kubenswrapper[13046]: I0308 03:18:03.887411 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlj9x\" (UniqueName: \"kubernetes.io/projected/4108f513-acef-473a-ab03-f3761b2bd0d8-kube-api-access-qlj9x\") pod \"cluster-monitoring-operator-674cbfbd9d-gj775\" (UID: \"4108f513-acef-473a-ab03-f3761b2bd0d8\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-gj775" Mar 08 03:18:03.908266 master-0 kubenswrapper[13046]: I0308 03:18:03.908134 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hb9\" (UniqueName: \"kubernetes.io/projected/cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd-kube-api-access-q9hb9\") pod \"ovnkube-node-krdvz\" (UID: \"cceeebd6-19f6-4a3a-a1eb-4ee1174a8cbd\") " pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:03.942185 master-0 kubenswrapper[13046]: I0308 03:18:03.942130 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27xv\" (UniqueName: \"kubernetes.io/projected/c3729e29-4c57-4f9b-8202-a87fd3a9a722-kube-api-access-s27xv\") pod \"machine-api-operator-84bf6db4f9-qt654\" (UID: \"c3729e29-4c57-4f9b-8202-a87fd3a9a722\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" Mar 08 03:18:03.954606 master-0 kubenswrapper[13046]: I0308 03:18:03.954565 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8fg7\" (UniqueName: \"kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7\") pod \"controller-manager-6494b94d74-kwkcq\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") " pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:03.969463 master-0 kubenswrapper[13046]: I0308 03:18:03.969409 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vvq\" (UniqueName: 
\"kubernetes.io/projected/6a9d0240-fc00-4d78-9458-8f53b1876f1b-kube-api-access-b2vvq\") pod \"redhat-operators-zm8fd\" (UID: \"6a9d0240-fc00-4d78-9458-8f53b1876f1b\") " pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:03.972772 master-0 kubenswrapper[13046]: I0308 03:18:03.972542 13046 scope.go:117] "RemoveContainer" containerID="5cb8f3acbb7aa9ec545c1b8e4b064d16cbafd48b223783d78db54ee94e2fb56a" Mar 08 03:18:03.973606 master-0 kubenswrapper[13046]: I0308 03:18:03.973570 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj8dt\" (UniqueName: \"kubernetes.io/projected/1a6e3f01-0f22-4961-b450-56aca5477943-kube-api-access-qj8dt\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:03.984421 master-0 kubenswrapper[13046]: I0308 03:18:03.983663 13046 scope.go:117] "RemoveContainer" containerID="37ec7f6b3aeafa0c1aa240a3f289ec19e14a9c93e8dc0c62d0b70aca6f9a3fcf" Mar 08 03:18:03.985545 master-0 kubenswrapper[13046]: I0308 03:18:03.985503 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pxwl\" (UniqueName: \"kubernetes.io/projected/306b824f-dcfb-4e69-9a23-64dfbae61852-kube-api-access-4pxwl\") pod \"migrator-57ccdf9b5-xps42\" (UID: \"306b824f-dcfb-4e69-9a23-64dfbae61852\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-xps42" Mar 08 03:18:03.993294 master-0 kubenswrapper[13046]: I0308 03:18:03.993104 13046 scope.go:117] "RemoveContainer" containerID="6c59d77b77a1f89b306ddf4cc0f2bd1da0d815a10de107029f05b136ace17ea9" Mar 08 03:18:04.002635 master-0 kubenswrapper[13046]: I0308 03:18:04.002605 13046 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 03:18:04.012007 master-0 kubenswrapper[13046]: I0308 03:18:04.011955 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5ll\" (UniqueName: \"kubernetes.io/projected/4f822854-b9ac-46f2-b03b-e7215fba9208-kube-api-access-rk5ll\") pod 
\"olm-operator-d64cfc9db-pdgmg\" (UID: \"4f822854-b9ac-46f2-b03b-e7215fba9208\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:18:04.026931 master-0 kubenswrapper[13046]: I0308 03:18:04.026544 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:04.028273 master-0 kubenswrapper[13046]: I0308 03:18:04.028241 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xsp\" (UniqueName: \"kubernetes.io/projected/dfe0357f-dab4-4424-869c-f6070b411a35-kube-api-access-w5xsp\") pod \"node-resolver-lmqn7\" (UID: \"dfe0357f-dab4-4424-869c-f6070b411a35\") " pod="openshift-dns/node-resolver-lmqn7" Mar 08 03:18:04.037992 master-0 kubenswrapper[13046]: I0308 03:18:04.035452 13046 scope.go:117] "RemoveContainer" containerID="a31cf751005d98b0c093a07cba9d36fdd0b091f0fc3e6728bcde1b51934cdbef" Mar 08 03:18:04.050135 master-0 kubenswrapper[13046]: I0308 03:18:04.047597 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf57p\" (UniqueName: \"kubernetes.io/projected/c0a08ddb-1045-4631-ba52-93f3046ebd0a-kube-api-access-rf57p\") pod \"service-ca-operator-69b6fc6b88-57b4v\" (UID: \"c0a08ddb-1045-4631-ba52-93f3046ebd0a\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" Mar 08 03:18:04.077144 master-0 kubenswrapper[13046]: I0308 03:18:04.077106 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpwx\" (UniqueName: \"kubernetes.io/projected/caa3a50c-1291-4152-a48a-f7c7b49627db-kube-api-access-6bpwx\") pod \"cloud-credential-operator-55d85b7b47-rggnq\" (UID: \"caa3a50c-1291-4152-a48a-f7c7b49627db\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" Mar 08 03:18:04.089508 master-0 kubenswrapper[13046]: I0308 03:18:04.088038 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pv8wt\" (UniqueName: \"kubernetes.io/projected/a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6-kube-api-access-pv8wt\") pod \"network-node-identity-xjg74\" (UID: \"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6\") " pod="openshift-network-node-identity/network-node-identity-xjg74" Mar 08 03:18:04.108458 master-0 kubenswrapper[13046]: E0308 03:18:04.108418 13046 projected.go:288] Couldn't get configMap openshift-kube-scheduler/kube-root-ca.crt: object "openshift-kube-scheduler"/"kube-root-ca.crt" not registered Mar 08 03:18:04.108458 master-0 kubenswrapper[13046]: E0308 03:18:04.108449 13046 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-scheduler/installer-4-master-0: object "openshift-kube-scheduler"/"kube-root-ca.crt" not registered Mar 08 03:18:04.108630 master-0 kubenswrapper[13046]: E0308 03:18:04.108512 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access podName:acb74744-fb99-4663-a7d0-7bae2db205e9 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:04.608493865 +0000 UTC m=+286.687261082 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access") pod "installer-4-master-0" (UID: "acb74744-fb99-4663-a7d0-7bae2db205e9") : object "openshift-kube-scheduler"/"kube-root-ca.crt" not registered Mar 08 03:18:04.126248 master-0 kubenswrapper[13046]: I0308 03:18:04.126205 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvzs9\" (UniqueName: \"kubernetes.io/projected/b33ed2de-435b-4ccc-8dfd-29d52bf95ea8-kube-api-access-fvzs9\") pod \"insights-operator-8f89dfddd-zd6kq\" (UID: \"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8\") " pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" Mar 08 03:18:04.154829 master-0 kubenswrapper[13046]: I0308 03:18:04.154777 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jk4m\" (UniqueName: \"kubernetes.io/projected/6432d23b-a55a-4131-83d5-5f16419809dd-kube-api-access-7jk4m\") pod \"openshift-apiserver-operator-799b6db4d7-krqcr\" (UID: \"6432d23b-a55a-4131-83d5-5f16419809dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" Mar 08 03:18:04.176516 master-0 kubenswrapper[13046]: I0308 03:18:04.176383 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nwgh\" (UniqueName: \"kubernetes.io/projected/8e1af4e8-2ade-48b3-8c56-0ab78f77fac9-kube-api-access-4nwgh\") pod \"operator-controller-controller-manager-6598bfb6c4-62spv\" (UID: \"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:04.187115 master-0 kubenswrapper[13046]: I0308 03:18:04.185520 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access\") pod \"acb74744-fb99-4663-a7d0-7bae2db205e9\" (UID: 
\"acb74744-fb99-4663-a7d0-7bae2db205e9\") " Mar 08 03:18:04.188811 master-0 kubenswrapper[13046]: I0308 03:18:04.187494 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25w4\" (UniqueName: \"kubernetes.io/projected/aa781f72-e72f-47e1-b37a-977340c182c8-kube-api-access-b25w4\") pod \"network-check-target-l5x6h\" (UID: \"aa781f72-e72f-47e1-b37a-977340c182c8\") " pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:18:04.205534 master-0 kubenswrapper[13046]: I0308 03:18:04.205344 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhxt\" (UniqueName: \"kubernetes.io/projected/3bf93333-b537-4f23-9c77-6a245b290fe3-kube-api-access-trhxt\") pod \"openshift-config-operator-64488f9d78-zg4zr\" (UID: \"3bf93333-b537-4f23-9c77-6a245b290fe3\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:04.210459 master-0 kubenswrapper[13046]: I0308 03:18:04.210388 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "acb74744-fb99-4663-a7d0-7bae2db205e9" (UID: "acb74744-fb99-4663-a7d0-7bae2db205e9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:04.226633 master-0 kubenswrapper[13046]: I0308 03:18:04.226285 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422p2\" (UniqueName: \"kubernetes.io/projected/d358134e-5625-492c-b4f7-460798631270-kube-api-access-422p2\") pod \"ovnkube-control-plane-66b55d57d-f4742\" (UID: \"d358134e-5625-492c-b4f7-460798631270\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" Mar 08 03:18:04.246065 master-0 kubenswrapper[13046]: I0308 03:18:04.245760 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7t9\" (UniqueName: \"kubernetes.io/projected/9cf6ce1a-c203-4033-86be-be16694a9062-kube-api-access-ml7t9\") pod \"redhat-marketplace-d5qh2\" (UID: \"9cf6ce1a-c203-4033-86be-be16694a9062\") " pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:18:04.268358 master-0 kubenswrapper[13046]: I0308 03:18:04.267925 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxbs\" (UniqueName: \"kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs\") pod \"certified-operators-r494d\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " pod="openshift-marketplace/certified-operators-r494d" Mar 08 03:18:04.284017 master-0 kubenswrapper[13046]: I0308 03:18:04.283974 13046 scope.go:117] "RemoveContainer" containerID="1a20bdbedb5b13853225f367842b80deec1d4120a3bc963794fd1350f7fbce22" Mar 08 03:18:04.287230 master-0 kubenswrapper[13046]: I0308 03:18:04.284770 13046 scope.go:117] "RemoveContainer" containerID="d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a" Mar 08 03:18:04.287230 master-0 kubenswrapper[13046]: I0308 03:18:04.285247 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get 
\"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" start-of-body= Mar 08 03:18:04.287230 master-0 kubenswrapper[13046]: I0308 03:18:04.285319 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" Mar 08 03:18:04.287230 master-0 kubenswrapper[13046]: I0308 03:18:04.285981 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" start-of-body= Mar 08 03:18:04.287230 master-0 kubenswrapper[13046]: I0308 03:18:04.286019 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" Mar 08 03:18:04.287230 master-0 kubenswrapper[13046]: I0308 03:18:04.286424 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddxbs\" (UniqueName: \"kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs\") pod \"b05d5093-20f4-42d5-9db3-811e049cc1b6\" (UID: \"b05d5093-20f4-42d5-9db3-811e049cc1b6\") " Mar 08 03:18:04.287230 master-0 kubenswrapper[13046]: I0308 03:18:04.286916 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acb74744-fb99-4663-a7d0-7bae2db205e9-kube-api-access\") on node \"master-0\" 
DevicePath \"\"" Mar 08 03:18:04.287965 master-0 kubenswrapper[13046]: I0308 03:18:04.287871 13046 scope.go:117] "RemoveContainer" containerID="723615f545a9b912d96e2b20f5beb286b3ce93e38e0a010ef0152a7b0e0c1b1e" Mar 08 03:18:04.291349 master-0 kubenswrapper[13046]: I0308 03:18:04.288364 13046 scope.go:117] "RemoveContainer" containerID="0f90c7e80ee619a77867feffa666b20dfa8fad2e9ecc5d700b999460ff6d737b" Mar 08 03:18:04.291349 master-0 kubenswrapper[13046]: I0308 03:18:04.289644 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pfx\" (UniqueName: \"kubernetes.io/projected/982ea338-c7be-4776-9bb7-113834c54aaa-kube-api-access-r8pfx\") pod \"network-operator-7c649bf6d4-98n6d\" (UID: \"982ea338-c7be-4776-9bb7-113834c54aaa\") " pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" Mar 08 03:18:04.291349 master-0 kubenswrapper[13046]: I0308 03:18:04.290593 13046 scope.go:117] "RemoveContainer" containerID="115308b4e38a50965cda00a6f3da9ba63adca456afd5e8dd547096a0f49ebb12" Mar 08 03:18:04.299757 master-0 kubenswrapper[13046]: I0308 03:18:04.297231 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs" (OuterVolumeSpecName: "kube-api-access-ddxbs") pod "b05d5093-20f4-42d5-9db3-811e049cc1b6" (UID: "b05d5093-20f4-42d5-9db3-811e049cc1b6"). InnerVolumeSpecName "kube-api-access-ddxbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:04.308911 master-0 kubenswrapper[13046]: I0308 03:18:04.307755 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh2rs\" (UniqueName: \"kubernetes.io/projected/de90d207-06d6-4778-b1b0-9020b1f2a881-kube-api-access-lh2rs\") pod \"tuned-ntxqg\" (UID: \"de90d207-06d6-4778-b1b0-9020b1f2a881\") " pod="openshift-cluster-node-tuning-operator/tuned-ntxqg" Mar 08 03:18:04.328749 master-0 kubenswrapper[13046]: I0308 03:18:04.328569 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"multus-admission-controller-8d675b596-772zs\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:18:04.334689 master-0 kubenswrapper[13046]: I0308 03:18:04.332374 13046 request.go:700] Waited for 2.876377269s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/serviceaccounts/cluster-storage-operator/token Mar 08 03:18:04.348803 master-0 kubenswrapper[13046]: I0308 03:18:04.348508 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97fm\" (UniqueName: \"kubernetes.io/projected/0e569889-4759-4046-b0ed-e550078521c6-kube-api-access-m97fm\") pod \"cluster-storage-operator-6fbfc8dc8f-4qmzb\" (UID: \"0e569889-4759-4046-b0ed-e550078521c6\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb" Mar 08 03:18:04.371711 master-0 kubenswrapper[13046]: I0308 03:18:04.365374 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dskxf\" (UniqueName: \"kubernetes.io/projected/53254b19-b5b3-4f97-bc64-37be8b2a41b7-kube-api-access-dskxf\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-gdwg9\" (UID: \"53254b19-b5b3-4f97-bc64-37be8b2a41b7\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:04.390228 master-0 kubenswrapper[13046]: I0308 03:18:04.388625 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddxbs\" (UniqueName: \"kubernetes.io/projected/b05d5093-20f4-42d5-9db3-811e049cc1b6-kube-api-access-ddxbs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:04.394298 master-0 kubenswrapper[13046]: I0308 03:18:04.393985 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7jv\" (UniqueName: \"kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv\") pod \"route-controller-manager-6fbc9556d8-l758n\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") " pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:04.410431 master-0 kubenswrapper[13046]: I0308 03:18:04.410404 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhh9\" (UniqueName: \"kubernetes.io/projected/c9f377bf-79c5-4425-b5d1-256961835f62-kube-api-access-6nhh9\") pod \"service-ca-84bfdbbb7f-fqhlq\" (UID: \"c9f377bf-79c5-4425-b5d1-256961835f62\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" Mar 08 03:18:04.428858 master-0 kubenswrapper[13046]: I0308 03:18:04.428822 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt69c\" (UniqueName: \"kubernetes.io/projected/33c15b06-a21e-411f-b324-3ae0c7f0e9a4-kube-api-access-qt69c\") pod \"cluster-samples-operator-664cb58b85-lhvrm\" (UID: \"33c15b06-a21e-411f-b324-3ae0c7f0e9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm" Mar 08 03:18:04.444523 master-0 kubenswrapper[13046]: I0308 03:18:04.444470 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntd2k\" (UniqueName: 
\"kubernetes.io/projected/dac2b210-2fbb-4d25-a0ea-1825259cee3b-kube-api-access-ntd2k\") pod \"apiserver-fb55b5d5d-pm69n\" (UID: \"dac2b210-2fbb-4d25-a0ea-1825259cee3b\") " pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:04.469160 master-0 kubenswrapper[13046]: I0308 03:18:04.469107 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llwh7\" (UniqueName: \"kubernetes.io/projected/1092f2a6-865c-4706-bba7-068621e85ebc-kube-api-access-llwh7\") pod \"machine-config-daemon-j6n9g\" (UID: \"1092f2a6-865c-4706-bba7-068621e85ebc\") " pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:04.486506 master-0 kubenswrapper[13046]: I0308 03:18:04.485842 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqnd\" (UniqueName: \"kubernetes.io/projected/f08a644f-3b61-46a7-a7b6-a9f7f2f7d266-kube-api-access-hxqnd\") pod \"authentication-operator-7c6989d6c4-zqlnx\" (UID: \"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:18:04.507236 master-0 kubenswrapper[13046]: W0308 03:18:04.507138 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd07fa0_00f3_4267_b64a_1e7c02fdf148.slice/crio-0f7cf2e7c5274f939368432574755e49e40aa59b7ea66be91ed8d72957e680b6 WatchSource:0}: Error finding container 0f7cf2e7c5274f939368432574755e49e40aa59b7ea66be91ed8d72957e680b6: Status 404 returned error can't find the container with id 0f7cf2e7c5274f939368432574755e49e40aa59b7ea66be91ed8d72957e680b6 Mar 08 03:18:04.540106 master-0 kubenswrapper[13046]: I0308 03:18:04.540059 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzmjd\" (UniqueName: \"kubernetes.io/projected/555ae3b4-71c6-4b62-9e09-66a58ae4c6ad-kube-api-access-rzmjd\") pod \"csi-snapshot-controller-7577d6f48-j6jpn\" 
(UID: \"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" Mar 08 03:18:04.549649 master-0 kubenswrapper[13046]: I0308 03:18:04.549604 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llf9g\" (UniqueName: \"kubernetes.io/projected/bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c-kube-api-access-llf9g\") pod \"cluster-node-tuning-operator-66c7586884-nnd8x\" (UID: \"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" Mar 08 03:18:04.560825 master-0 kubenswrapper[13046]: E0308 03:18:04.560756 13046 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:18:04.560825 master-0 kubenswrapper[13046]: E0308 03:18:04.560793 13046 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:18:04.561049 master-0 kubenswrapper[13046]: E0308 03:18:04.560850 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access podName:7ea81472-8a81-45ec-a07d-8710f47a927d nodeName:}" failed. No retries permitted until 2026-03-08 03:18:05.060830109 +0000 UTC m=+287.139597406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access") pod "installer-1-master-0" (UID: "7ea81472-8a81-45ec-a07d-8710f47a927d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 08 03:18:04.572255 master-0 kubenswrapper[13046]: I0308 03:18:04.572215 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpd47\" (UniqueName: \"kubernetes.io/projected/c31f7cee-d21d-4c23-af9b-1e0180b12e1e-kube-api-access-xpd47\") pod \"multus-admission-controller-7769569c45-f85rr\" (UID: \"c31f7cee-d21d-4c23-af9b-1e0180b12e1e\") " pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:18:04.582790 master-0 kubenswrapper[13046]: I0308 03:18:04.582744 13046 scope.go:117] "RemoveContainer" containerID="6a754fcfb0d67c328aad3537f5cd3aea4c5a542bc823d6a29cf5e7022aa42ed0" Mar 08 03:18:04.583048 master-0 kubenswrapper[13046]: I0308 03:18:04.583016 13046 scope.go:117] "RemoveContainer" containerID="d901422733644b9a69bd0914635930a2d55c9786ff5a015eee041ee28b2a4386" Mar 08 03:18:04.586476 master-0 kubenswrapper[13046]: I0308 03:18:04.586447 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-bound-sa-token\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:18:04.588246 master-0 kubenswrapper[13046]: I0308 03:18:04.588158 13046 scope.go:117] "RemoveContainer" containerID="34a41043128393510c095711912036e3de6953d35852c470aeee13ef6010b118" Mar 08 03:18:04.588566 master-0 kubenswrapper[13046]: I0308 03:18:04.588545 13046 scope.go:117] "RemoveContainer" containerID="d4ea1844b53b95e64939abf18bf680af5d21c94a78af3eaf8fa2b814c48bf2f0" Mar 08 03:18:04.591828 master-0 kubenswrapper[13046]: I0308 
03:18:04.591645 13046 scope.go:117] "RemoveContainer" containerID="95cb1ab0414f6248676ceab0da8402d36a93f6fced2ddcec794373deb0d0db80" Mar 08 03:18:04.597074 master-0 kubenswrapper[13046]: I0308 03:18:04.595870 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access\") pod \"7ea81472-8a81-45ec-a07d-8710f47a927d\" (UID: \"7ea81472-8a81-45ec-a07d-8710f47a927d\") " Mar 08 03:18:04.599166 master-0 kubenswrapper[13046]: I0308 03:18:04.599062 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7ea81472-8a81-45ec-a07d-8710f47a927d" (UID: "7ea81472-8a81-45ec-a07d-8710f47a927d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:04.606504 master-0 kubenswrapper[13046]: I0308 03:18:04.605606 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c5z\" (UniqueName: \"kubernetes.io/projected/9949f9f4-00f3-4ac8-b8a2-a9549693f5b1-kube-api-access-d5c5z\") pod \"multus-hfnwm\" (UID: \"9949f9f4-00f3-4ac8-b8a2-a9549693f5b1\") " pod="openshift-multus/multus-hfnwm" Mar 08 03:18:04.626077 master-0 kubenswrapper[13046]: I0308 03:18:04.626039 13046 scope.go:117] "RemoveContainer" containerID="ad59cc4c7958a82cb7e8357828383997f6ce39b4d62e09c7ada95209a7513c90" Mar 08 03:18:04.633049 master-0 kubenswrapper[13046]: I0308 03:18:04.633014 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxtt\" (UniqueName: \"kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt\") pod \"cluster-cloud-controller-manager-operator-559568b945-kkm7z\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:18:04.633621 master-0 kubenswrapper[13046]: I0308 03:18:04.633590 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:18:04.638723 master-0 kubenswrapper[13046]: I0308 03:18:04.638662 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" Mar 08 03:18:04.653162 master-0 kubenswrapper[13046]: I0308 03:18:04.652707 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dn9\" (UniqueName: \"kubernetes.io/projected/8f99f81a-fd2d-432e-a3bc-e451342650b1-kube-api-access-s8dn9\") pod \"dns-operator-589895fbb7-z45kw\" (UID: \"8f99f81a-fd2d-432e-a3bc-e451342650b1\") " pod="openshift-dns-operator/dns-operator-589895fbb7-z45kw" Mar 08 03:18:04.669791 master-0 kubenswrapper[13046]: E0308 03:18:04.669256 13046 projected.go:288] Couldn't get configMap openshift-cluster-machine-approver/kube-root-ca.crt: object "openshift-cluster-machine-approver"/"kube-root-ca.crt" not registered Mar 08 03:18:04.669791 master-0 kubenswrapper[13046]: E0308 03:18:04.669307 13046 projected.go:288] Couldn't get configMap openshift-cluster-machine-approver/openshift-service-ca.crt: object "openshift-cluster-machine-approver"/"openshift-service-ca.crt" not registered Mar 08 03:18:04.669791 master-0 kubenswrapper[13046]: E0308 03:18:04.669320 13046 projected.go:194] Error preparing data for projected volume kube-api-access-6bcd7 for pod openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4: [object "openshift-cluster-machine-approver"/"kube-root-ca.crt" not registered, object "openshift-cluster-machine-approver"/"openshift-service-ca.crt" not registered] Mar 08 03:18:04.669791 master-0 kubenswrapper[13046]: E0308 03:18:04.669386 13046 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7 podName:234638fe-5577-45bc-9094-907c5611da38 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:05.169353196 +0000 UTC m=+287.248120413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6bcd7" (UniqueName: "kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7") pod "machine-approver-955fcfb87-jxrq4" (UID: "234638fe-5577-45bc-9094-907c5611da38") : [object "openshift-cluster-machine-approver"/"kube-root-ca.crt" not registered, object "openshift-cluster-machine-approver"/"openshift-service-ca.crt" not registered] Mar 08 03:18:04.686188 master-0 kubenswrapper[13046]: I0308 03:18:04.686151 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58bm\" (UniqueName: \"kubernetes.io/projected/68309159-130a-4ffa-acec-95dc4b795b8f-kube-api-access-k58bm\") pod \"community-operators-zm92r\" (UID: \"68309159-130a-4ffa-acec-95dc4b795b8f\") " pod="openshift-marketplace/community-operators-zm92r" Mar 08 03:18:04.697187 master-0 kubenswrapper[13046]: I0308 03:18:04.697091 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bcd7\" (UniqueName: \"kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7\") pod \"234638fe-5577-45bc-9094-907c5611da38\" (UID: \"234638fe-5577-45bc-9094-907c5611da38\") " Mar 08 03:18:04.697412 master-0 kubenswrapper[13046]: I0308 03:18:04.697383 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ea81472-8a81-45ec-a07d-8710f47a927d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:04.717272 master-0 kubenswrapper[13046]: I0308 03:18:04.717231 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8zh5b\" (UniqueName: \"kubernetes.io/projected/fd6b827c-70b0-47ed-b07c-c696343248a8-kube-api-access-8zh5b\") pod \"ingress-operator-677db989d6-r9m2k\" (UID: \"fd6b827c-70b0-47ed-b07c-c696343248a8\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" Mar 08 03:18:04.719081 master-0 kubenswrapper[13046]: I0308 03:18:04.719037 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7" (OuterVolumeSpecName: "kube-api-access-6bcd7") pod "234638fe-5577-45bc-9094-907c5611da38" (UID: "234638fe-5577-45bc-9094-907c5611da38"). InnerVolumeSpecName "kube-api-access-6bcd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:04.737249 master-0 kubenswrapper[13046]: I0308 03:18:04.737202 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b83ab56c-e28d-4e82-ae8f-92649a1448ed-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-hw2kt\" (UID: \"b83ab56c-e28d-4e82-ae8f-92649a1448ed\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" Mar 08 03:18:04.739067 master-0 kubenswrapper[13046]: I0308 03:18:04.738452 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-qsgq7_70fba73e-c201-4866-bc69-64892ea5bdca/openshift-controller-manager-operator/0.log" Mar 08 03:18:04.752242 master-0 kubenswrapper[13046]: I0308 03:18:04.752205 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qqr\" (UniqueName: \"kubernetes.io/projected/76ceb013-e999-4f15-bf25-f8dcd2647f9f-kube-api-access-s8qqr\") pod \"multus-additional-cni-plugins-5qjn5\" (UID: \"76ceb013-e999-4f15-bf25-f8dcd2647f9f\") " pod="openshift-multus/multus-additional-cni-plugins-5qjn5" Mar 08 03:18:04.775866 
master-0 kubenswrapper[13046]: W0308 03:18:04.771574 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1092f2a6_865c_4706_bba7_068621e85ebc.slice/crio-71b99594a28a0ded9c84cbe541990e78ed4295d37bb5353fd4aa2f5819a522ca WatchSource:0}: Error finding container 71b99594a28a0ded9c84cbe541990e78ed4295d37bb5353fd4aa2f5819a522ca: Status 404 returned error can't find the container with id 71b99594a28a0ded9c84cbe541990e78ed4295d37bb5353fd4aa2f5819a522ca Mar 08 03:18:04.785586 master-0 kubenswrapper[13046]: E0308 03:18:04.779563 13046 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 08 03:18:04.785586 master-0 kubenswrapper[13046]: E0308 03:18:04.779616 13046 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 08 03:18:04.785586 master-0 kubenswrapper[13046]: E0308 03:18:04.779678 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access podName:0781e6af-f5b5-40f7-bb7f-5bc6978b4957 nodeName:}" failed. No retries permitted until 2026-03-08 03:18:05.279652651 +0000 UTC m=+287.358419868 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access") pod "installer-2-master-0" (UID: "0781e6af-f5b5-40f7-bb7f-5bc6978b4957") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 08 03:18:04.797932 master-0 kubenswrapper[13046]: I0308 03:18:04.797895 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access\") pod \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\" (UID: \"0781e6af-f5b5-40f7-bb7f-5bc6978b4957\") " Mar 08 03:18:04.798248 master-0 kubenswrapper[13046]: I0308 03:18:04.798223 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bcd7\" (UniqueName: \"kubernetes.io/projected/234638fe-5577-45bc-9094-907c5611da38-kube-api-access-6bcd7\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:04.798438 master-0 kubenswrapper[13046]: I0308 03:18:04.798400 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/1.log" Mar 08 03:18:04.798581 master-0 kubenswrapper[13046]: I0308 03:18:04.798543 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556dx\" (UniqueName: \"kubernetes.io/projected/f99d6808-9fec-402d-93f7-41575a5a0a08-kube-api-access-556dx\") pod \"apiserver-778796f487-vzb5n\" (UID: \"f99d6808-9fec-402d-93f7-41575a5a0a08\") " pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:04.806292 master-0 kubenswrapper[13046]: I0308 03:18:04.805417 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/0.log" Mar 08 03:18:04.806292 master-0 
kubenswrapper[13046]: I0308 03:18:04.805462 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" containerID="a2120db451b5376796065b553810bc8500ecbfec6f711b80ba3aef0fdc6a5c29" exitCode=1 Mar 08 03:18:04.812631 master-0 kubenswrapper[13046]: I0308 03:18:04.812409 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0781e6af-f5b5-40f7-bb7f-5bc6978b4957" (UID: "0781e6af-f5b5-40f7-bb7f-5bc6978b4957"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:04.822022 master-0 kubenswrapper[13046]: I0308 03:18:04.821942 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb4bq\" (UniqueName: \"kubernetes.io/projected/275be8d3-df30-46f7-9d0a-806e404dfd57-kube-api-access-fb4bq\") pod \"iptables-alerter-g86jc\" (UID: \"275be8d3-df30-46f7-9d0a-806e404dfd57\") " pod="openshift-network-operator/iptables-alerter-g86jc" Mar 08 03:18:04.836500 master-0 kubenswrapper[13046]: I0308 03:18:04.829650 13046 generic.go:334] "Generic (PLEG): container finished" podID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerID="873a972e72df02e333cdd5be8d4415642ae4a31a8ef844e8221962cd437b0309" exitCode=0 Mar 08 03:18:04.836500 master-0 kubenswrapper[13046]: I0308 03:18:04.830187 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-885mp\" (UniqueName: \"kubernetes.io/projected/bda3bd48-6de3-49b0-b2ce-96d97e97f178-kube-api-access-885mp\") pod \"dns-default-htnv4\" (UID: \"bda3bd48-6de3-49b0-b2ce-96d97e97f178\") " pod="openshift-dns/dns-default-htnv4" Mar 08 03:18:04.848468 master-0 kubenswrapper[13046]: E0308 03:18:04.848435 13046 projected.go:288] Couldn't get configMap openshift-etcd/kube-root-ca.crt: object "openshift-etcd"/"kube-root-ca.crt" not registered Mar 08 
03:18:04.848575 master-0 kubenswrapper[13046]: E0308 03:18:04.848565 13046 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-etcd/installer-1-master-0: object "openshift-etcd"/"kube-root-ca.crt" not registered Mar 08 03:18:04.848735 master-0 kubenswrapper[13046]: E0308 03:18:04.848713 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access podName:2dc664e3-7f37-4fba-8104-544ffb18c1bd nodeName:}" failed. No retries permitted until 2026-03-08 03:18:05.348601149 +0000 UTC m=+287.427368356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access") pod "installer-1-master-0" (UID: "2dc664e3-7f37-4fba-8104-544ffb18c1bd") : object "openshift-etcd"/"kube-root-ca.crt" not registered Mar 08 03:18:04.872744 master-0 kubenswrapper[13046]: I0308 03:18:04.872705 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84q5n\" (UniqueName: \"kubernetes.io/projected/5a2c9576-f7bd-4ac5-a7fe-530f26642f97-kube-api-access-84q5n\") pod \"control-plane-machine-set-operator-6686554ddc-gwnnd\" (UID: \"5a2c9576-f7bd-4ac5-a7fe-530f26642f97\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" Mar 08 03:18:04.882585 master-0 kubenswrapper[13046]: I0308 03:18:04.881080 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/1.log" Mar 08 03:18:04.882585 master-0 kubenswrapper[13046]: I0308 03:18:04.881801 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/0.log" Mar 08 03:18:04.882585 
master-0 kubenswrapper[13046]: I0308 03:18:04.881895 13046 kubelet_pods.go:1320] "Clean up containers for orphaned pod we had not seen before" podUID="5f77c8e18b751d90bc0dfe2d4e304050" killPodOptions="" Mar 08 03:18:04.882585 master-0 kubenswrapper[13046]: I0308 03:18:04.882097 13046 generic.go:334] "Generic (PLEG): container finished" podID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" containerID="01b0c9283601294ac91726ca827dcedce1733800cdf20df312fa53a7adcb6145" exitCode=1 Mar 08 03:18:04.882585 master-0 kubenswrapper[13046]: I0308 03:18:04.882258 13046 scope.go:117] "RemoveContainer" containerID="927e976b2419f80e2b156dd6620627f0ab5b15535fdab986491afec086084730" Mar 08 03:18:04.886497 master-0 kubenswrapper[13046]: I0308 03:18:04.883432 13046 scope.go:117] "RemoveContainer" containerID="ab858aba9fe747164d134176fff1d99d6f77b5114eeaf6f38c2480128cb7485f" Mar 08 03:18:04.909537 master-0 kubenswrapper[13046]: I0308 03:18:04.906183 13046 scope.go:117] "RemoveContainer" containerID="27ab0f00e980c7d4d9fcf7e8c62f276ea49b975eb80fef82536adf6bfc74a796" Mar 08 03:18:04.909537 master-0 kubenswrapper[13046]: I0308 03:18:04.906571 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access\") pod \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\" (UID: \"2dc664e3-7f37-4fba-8104-544ffb18c1bd\") " Mar 08 03:18:04.909537 master-0 kubenswrapper[13046]: I0308 03:18:04.906937 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0781e6af-f5b5-40f7-bb7f-5bc6978b4957-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:04.912038 master-0 kubenswrapper[13046]: I0308 03:18:04.911987 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod 
"2dc664e3-7f37-4fba-8104-544ffb18c1bd" (UID: "2dc664e3-7f37-4fba-8104-544ffb18c1bd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: E0308 03:18:04.924917 13046 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.807s" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925007 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925030 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925065 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925076 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nggbb" event={"ID":"1a6e3f01-0f22-4961-b450-56aca5477943","Type":"ContainerDied","Data":"774988f3ef3d0b52b92bf1ebf44629f520bddc0f247789f3e3c1508bfcf443ed"} Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925113 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925127 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6494b94d74-kwkcq"] Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925142 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 
08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925150 13046 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="199ae602-a267-48d4-bfaf-162ba27cf027" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925163 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925170 13046 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="199ae602-a267-48d4-bfaf-162ba27cf027" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925180 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:18:04.926619 master-0 kubenswrapper[13046]: I0308 03:18:04.925223 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:04.935230 master-0 kubenswrapper[13046]: I0308 03:18:04.930129 13046 scope.go:117] "RemoveContainer" containerID="efa860cee031eccbf226be40d18bc7a86bd5de050910722eed848d08f9d751a8" Mar 08 03:18:04.935230 master-0 kubenswrapper[13046]: I0308 03:18:04.930531 13046 scope.go:117] "RemoveContainer" containerID="01b0c9283601294ac91726ca827dcedce1733800cdf20df312fa53a7adcb6145" Mar 08 03:18:04.935230 master-0 kubenswrapper[13046]: E0308 03:18:04.930702 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" 
pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.980545 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f77c8e18b751d90bc0dfe2d4e304050" path="/var/lib/kubelet/pods/5f77c8e18b751d90bc0dfe2d4e304050/volumes" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981237 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981254 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"2dc664e3-7f37-4fba-8104-544ffb18c1bd","Type":"ContainerDied","Data":"047da584d2529f2fb501b0d6492e21ff57d43239aa0e696e0881839756608bc9"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981270 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="047da584d2529f2fb501b0d6492e21ff57d43239aa0e696e0881839756608bc9" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981298 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-vsnbw" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981339 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981350 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86z4t" event={"ID":"7e324f6c-ee4c-42bc-b241-9c6938749854","Type":"ContainerDied","Data":"9afe00eb0e4be585ed826e28d4d17caeacf8b464f8497cf5015dec507b7a134f"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981361 13046 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981370 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"acb74744-fb99-4663-a7d0-7bae2db205e9","Type":"ContainerDied","Data":"a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981380 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a338d440c2bb75ea01ff688f80e02a13069937875709444a6d65480d78f19176" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981408 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981435 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981443 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0781e6af-f5b5-40f7-bb7f-5bc6978b4957","Type":"ContainerDied","Data":"5f838d68f76fe62ef4db0397f12606cc88f67f5fe18c59d85b5a1981e0647d72"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981453 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f838d68f76fe62ef4db0397f12606cc88f67f5fe18c59d85b5a1981e0647d72" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981461 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r494d" 
event={"ID":"b05d5093-20f4-42d5-9db3-811e049cc1b6","Type":"ContainerDied","Data":"702795b7a3b9492f17a3552f3377a1320bf2ba8da965c8533a8f5f8dc47e6545"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981480 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981512 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-5675c97455-jxfcz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981540 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981564 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981587 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krdvz" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981596 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" event={"ID":"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c","Type":"ContainerDied","Data":"8a3da09cabdcb126428fcd447defcc99973fd5db3565d3792f66591da1ac8333"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981609 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" event={"ID":"ba9496ed-060e-4118-9da6-89b82bd49263","Type":"ContainerStarted","Data":"d7bbde25e7a29335f2b74975574f130c47ac2f7c63ea1d074484ef4e53a9352d"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: 
I0308 03:18:04.981619 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-4vqgc" event={"ID":"e74c8bb2-e063-4b60-b3fe-651aa534d029","Type":"ContainerStarted","Data":"02a271b92f61f920152e928797a8d6f2f286e22239b1948f5c6cc0a6ac93e75d"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981642 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981660 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981677 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981686 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" event={"ID":"6432d23b-a55a-4131-83d5-5f16419809dd","Type":"ContainerStarted","Data":"6190833c9f0b16bfe58201dc6ffba8cfe78f629e2c29aa82c2bfdfc57bb89f22"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981698 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" event={"ID":"3178dfc0-a35e-418e-a954-cd919b8af88c","Type":"ContainerStarted","Data":"1ab4d8b69b478fc835cd8fd3fc069a281c945a4405e054f4403e41111399ed35"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981707 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 
03:18:04.981716 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-2z74v" event={"ID":"af653e87-ce5f-4f1a-a20d-233c563694ba","Type":"ContainerStarted","Data":"6147e811ddfe841a6ee5c39c575500896e93d71345296d2ccde4bf3c78c95dbd"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981732 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-l5x6h" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981741 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981749 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" event={"ID":"70fba73e-c201-4866-bc69-64892ea5bdca","Type":"ContainerStarted","Data":"0d4bbc1c7dcdc0d723f39e3f9d1635f91be7a55d561224fda15ad22d37e6f16a"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981758 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" event={"ID":"c0a08ddb-1045-4631-ba52-93f3046ebd0a","Type":"ContainerStarted","Data":"1cb76e3cff18055c242f15145cff75d29ff51e57b0a87a830e5705157c3314d3"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981768 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981776 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" 
event={"ID":"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b","Type":"ContainerDied","Data":"a2120db451b5376796065b553810bc8500ecbfec6f711b80ba3aef0fdc6a5c29"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981794 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-pdgmg" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981801 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerDied","Data":"873a972e72df02e333cdd5be8d4415642ae4a31a8ef844e8221962cd437b0309"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981814 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981823 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerStarted","Data":"8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981831 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981852 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981868 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 
03:18:04.981875 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" event={"ID":"2bbe9b81-0efb-4caa-bacd-55348cd392c6","Type":"ContainerDied","Data":"01b0c9283601294ac91726ca827dcedce1733800cdf20df312fa53a7adcb6145"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981885 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-dddvl" event={"ID":"fe33f926-9348-4498-a892-d2becaeecc14","Type":"ContainerStarted","Data":"d99d128996a1edea0e2f4a6c82cf064b00a97a35d8aa37d046fbe6458a60bae1"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981896 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981907 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" event={"ID":"e71caa06-6ce7-47c9-a267-21f6b6af9247","Type":"ContainerStarted","Data":"96a9a66bbda3ed971dd7e07c7aa61fb398502ffd9a81746e62e5bce080cd2621"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981916 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981923 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" event={"ID":"6bd07fa0-00f3-4267-b64a-1e7c02fdf148","Type":"ContainerStarted","Data":"0f7cf2e7c5274f939368432574755e49e40aa59b7ea66be91ed8d72957e680b6"} Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981943 13046 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-dns/dns-default-htnv4" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981965 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981975 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981989 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-htnv4" Mar 08 03:18:04.983498 master-0 kubenswrapper[13046]: I0308 03:18:04.981998 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:04.994831 master-0 kubenswrapper[13046]: I0308 03:18:04.990672 13046 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-7hsbf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" start-of-body= Mar 08 03:18:04.994831 master-0 kubenswrapper[13046]: I0308 03:18:04.990724 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" Mar 08 03:18:05.001069 master-0 kubenswrapper[13046]: I0308 03:18:05.000422 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xjg74_a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/approver/0.log" Mar 08 03:18:05.001519 master-0 kubenswrapper[13046]: I0308 
03:18:05.001467 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:05.002720 master-0 kubenswrapper[13046]: I0308 03:18:05.002693 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:18:05.004406 master-0 kubenswrapper[13046]: I0308 03:18:05.004371 13046 scope.go:117] "RemoveContainer" containerID="873a972e72df02e333cdd5be8d4415642ae4a31a8ef844e8221962cd437b0309" Mar 08 03:18:05.008995 master-0 kubenswrapper[13046]: I0308 03:18:05.008738 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xjg74" event={"ID":"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6","Type":"ContainerStarted","Data":"97b7bdeb275d63cf51a95c68c58ee5ca2e124f9930c59d4a9b4c4dfb86ee0b8c"} Mar 08 03:18:05.014046 master-0 kubenswrapper[13046]: I0308 03:18:05.014013 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dc664e3-7f37-4fba-8104-544ffb18c1bd-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:18:05.024097 master-0 kubenswrapper[13046]: I0308 03:18:05.022300 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" event={"ID":"8c0192f3-2e60-42c6-9836-c70a9fa407d5","Type":"ContainerStarted","Data":"38d50ee052cbf3d522047ab11f11ab5aceb271a88779bde26e4fce4c2a1ce3bf"} Mar 08 03:18:05.074329 master-0 kubenswrapper[13046]: I0308 03:18:05.074246 13046 scope.go:117] "RemoveContainer" containerID="a2120db451b5376796065b553810bc8500ecbfec6f711b80ba3aef0fdc6a5c29" Mar 08 03:18:05.074709 master-0 kubenswrapper[13046]: E0308 03:18:05.074673 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-gwv4q_openshift-machine-api(5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" podUID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" Mar 08 03:18:05.095297 master-0 kubenswrapper[13046]: I0308 03:18:05.093677 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerStarted","Data":"b70730e593cef55ada178b0edb721bde407e4ab726f8f341648fd239c8fd9e8b"} Mar 08 03:18:05.104709 master-0 kubenswrapper[13046]: I0308 03:18:05.104306 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-f4742" event={"ID":"d358134e-5625-492c-b4f7-460798631270","Type":"ContainerStarted","Data":"e2bc83fc36fd654a9542d92ed58a826b2a26f162caf2134654c45af5870884d0"} Mar 08 03:18:05.106837 master-0 kubenswrapper[13046]: I0308 03:18:05.106199 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" event={"ID":"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8","Type":"ContainerStarted","Data":"59a094af1d0fb4c0580e000ba9579d4683433cc1042d1d4763a483f3bf1e9302"} Mar 08 03:18:05.112553 master-0 kubenswrapper[13046]: I0308 03:18:05.112500 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" event={"ID":"aadbbe97-2a03-40da-846d-252e29661f67","Type":"ContainerStarted","Data":"35eb39f11e0262a7f614f3efd65f912a40d31232432eefb143835967713072aa"} Mar 08 03:18:05.123564 master-0 kubenswrapper[13046]: I0308 03:18:05.123107 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" 
event={"ID":"c3729e29-4c57-4f9b-8202-a87fd3a9a722","Type":"ContainerStarted","Data":"60dd4b40e6f3723629ce38c6a3f9a1142bcf700ad065935f889877fd83e64912"} Mar 08 03:18:05.146435 master-0 kubenswrapper[13046]: I0308 03:18:05.146356 13046 scope.go:117] "RemoveContainer" containerID="f23bd786497d6c307edb85e8d774c9b8f2223af0ca9dc43c45c0639c00c00251" Mar 08 03:18:05.151653 master-0 kubenswrapper[13046]: I0308 03:18:05.150872 13046 scope.go:117] "RemoveContainer" containerID="8a3da09cabdcb126428fcd447defcc99973fd5db3565d3792f66591da1ac8333" Mar 08 03:18:05.210243 master-0 kubenswrapper[13046]: I0308 03:18:05.209308 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-f85rr"] Mar 08 03:18:05.268114 master-0 kubenswrapper[13046]: W0308 03:18:05.268073 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31f7cee_d21d_4c23_af9b_1e0180b12e1e.slice/crio-65b42de5e2e1b6f62e37c0b36a26d0ffcdff8f4995caa1d47d964576ccb0a8ec WatchSource:0}: Error finding container 65b42de5e2e1b6f62e37c0b36a26d0ffcdff8f4995caa1d47d964576ccb0a8ec: Status 404 returned error can't find the container with id 65b42de5e2e1b6f62e37c0b36a26d0ffcdff8f4995caa1d47d964576ccb0a8ec Mar 08 03:18:05.358188 master-0 kubenswrapper[13046]: I0308 03:18:05.358149 13046 scope.go:117] "RemoveContainer" containerID="9dcf81635de6906146c147e78ec6bda20f98dd55e53a8e7eb4bd3270e962f41e" Mar 08 03:18:05.442339 master-0 kubenswrapper[13046]: I0308 03:18:05.440040 13046 scope.go:117] "RemoveContainer" containerID="643f3b1d5189adb625272097c9d23e7af0847cd627439de5de3ccca7ed7bb060" Mar 08 03:18:05.553616 master-0 kubenswrapper[13046]: I0308 03:18:05.551672 13046 scope.go:117] "RemoveContainer" containerID="67b6371de1e40f11492bdbedad65b4bb4c5dafeb7f94b97c8372fcadf4c1308d" Mar 08 03:18:05.596643 master-0 kubenswrapper[13046]: I0308 03:18:05.596600 13046 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-86z4t"] Mar 08 03:18:05.599507 master-0 kubenswrapper[13046]: I0308 03:18:05.599468 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86z4t"] Mar 08 03:18:05.650590 master-0 kubenswrapper[13046]: I0308 03:18:05.648214 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r494d"] Mar 08 03:18:05.651595 master-0 kubenswrapper[13046]: I0308 03:18:05.651562 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r494d"] Mar 08 03:18:05.716388 master-0 kubenswrapper[13046]: I0308 03:18:05.713569 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4"] Mar 08 03:18:05.722021 master-0 kubenswrapper[13046]: I0308 03:18:05.721896 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jxrq4"] Mar 08 03:18:06.145710 master-0 kubenswrapper[13046]: I0308 03:18:06.145652 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234638fe-5577-45bc-9094-907c5611da38" path="/var/lib/kubelet/pods/234638fe-5577-45bc-9094-907c5611da38/volumes" Mar 08 03:18:06.146343 master-0 kubenswrapper[13046]: I0308 03:18:06.146320 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e324f6c-ee4c-42bc-b241-9c6938749854" path="/var/lib/kubelet/pods/7e324f6c-ee4c-42bc-b241-9c6938749854/volumes" Mar 08 03:18:06.148604 master-0 kubenswrapper[13046]: I0308 03:18:06.148326 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05d5093-20f4-42d5-9db3-811e049cc1b6" path="/var/lib/kubelet/pods/b05d5093-20f4-42d5-9db3-811e049cc1b6/volumes" Mar 08 03:18:06.152342 master-0 kubenswrapper[13046]: I0308 03:18:06.152318 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-nnd8x_bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/cluster-node-tuning-operator/0.log" Mar 08 03:18:06.152423 master-0 kubenswrapper[13046]: I0308 03:18:06.152382 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" event={"ID":"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c","Type":"ContainerStarted","Data":"82344cdd1de2ca9d97e9556224c6c294e5e614051e31021d5bf08c8ea15a927c"} Mar 08 03:18:06.156036 master-0 kubenswrapper[13046]: I0308 03:18:06.156014 13046 scope.go:117] "RemoveContainer" containerID="4408f61b9048ed833e9161e86cec42c8c15221795d207fe82e8f7a4527778dfb" Mar 08 03:18:06.166797 master-0 kubenswrapper[13046]: I0308 03:18:06.166754 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" event={"ID":"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266","Type":"ContainerStarted","Data":"5e1a6b1b75dd7e94ec8809b92497979f830922f94f2da614dcc13ffdd7ee147b"} Mar 08 03:18:06.175537 master-0 kubenswrapper[13046]: I0308 03:18:06.174396 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" event={"ID":"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5","Type":"ContainerStarted","Data":"1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6"} Mar 08 03:18:06.175537 master-0 kubenswrapper[13046]: I0308 03:18:06.174840 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:06.190197 master-0 kubenswrapper[13046]: I0308 03:18:06.180704 13046 scope.go:117] "RemoveContainer" containerID="66dcb2ef9f56c8175e9938f33a7650abc0b5ef0e638ee33a15fd5eee5cc90aba" Mar 08 03:18:06.204496 master-0 kubenswrapper[13046]: I0308 03:18:06.201196 13046 generic.go:334] 
"Generic (PLEG): container finished" podID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" containerID="59a094af1d0fb4c0580e000ba9579d4683433cc1042d1d4763a483f3bf1e9302" exitCode=0 Mar 08 03:18:06.204496 master-0 kubenswrapper[13046]: I0308 03:18:06.201283 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" event={"ID":"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8","Type":"ContainerDied","Data":"59a094af1d0fb4c0580e000ba9579d4683433cc1042d1d4763a483f3bf1e9302"} Mar 08 03:18:06.204496 master-0 kubenswrapper[13046]: I0308 03:18:06.201783 13046 scope.go:117] "RemoveContainer" containerID="59a094af1d0fb4c0580e000ba9579d4683433cc1042d1d4763a483f3bf1e9302" Mar 08 03:18:06.232240 master-0 kubenswrapper[13046]: I0308 03:18:06.230614 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:18:06.232573 master-0 kubenswrapper[13046]: I0308 03:18:06.232412 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-rggnq" event={"ID":"caa3a50c-1291-4152-a48a-f7c7b49627db","Type":"ContainerStarted","Data":"6407c899f2a1b248474975e37b02764cdb99d98cbf1e92b6719366fc222dfb38"} Mar 08 03:18:06.243932 master-0 kubenswrapper[13046]: I0308 03:18:06.243894 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 08 03:18:06.263965 master-0 kubenswrapper[13046]: I0308 03:18:06.263689 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" event={"ID":"c9f377bf-79c5-4425-b5d1-256961835f62","Type":"ContainerStarted","Data":"f64fa3836d9bdc65eec219bc3a2a3624ab55436c5e1f3236c3b13b54753839d7"} Mar 08 03:18:06.266772 master-0 kubenswrapper[13046]: I0308 03:18:06.266177 13046 generic.go:334] "Generic (PLEG): container finished" podID="9cf6ce1a-c203-4033-86be-be16694a9062" 
containerID="022c46388755fbf6553ce1c554714202fc9f8043dc3fa8f9885f628555dfbe46" exitCode=0 Mar 08 03:18:06.266772 master-0 kubenswrapper[13046]: I0308 03:18:06.266236 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qh2" event={"ID":"9cf6ce1a-c203-4033-86be-be16694a9062","Type":"ContainerDied","Data":"022c46388755fbf6553ce1c554714202fc9f8043dc3fa8f9885f628555dfbe46"} Mar 08 03:18:06.272896 master-0 kubenswrapper[13046]: I0308 03:18:06.272591 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerStarted","Data":"1028ff741dd20a61a2f7b9e42cfcdd892e469b9feb8dd96c4b2001af9418f66d"} Mar 08 03:18:06.272896 master-0 kubenswrapper[13046]: I0308 03:18:06.272820 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:06.284952 master-0 kubenswrapper[13046]: I0308 03:18:06.284817 13046 generic.go:334] "Generic (PLEG): container finished" podID="50ab8f71-42b8-4967-8a0b-016647c59a37" containerID="9a316939390b06364ea1acf84f271fce9406bcf6aac8571a251819201238c574" exitCode=0 Mar 08 03:18:06.284952 master-0 kubenswrapper[13046]: I0308 03:18:06.284872 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnlct" event={"ID":"50ab8f71-42b8-4967-8a0b-016647c59a37","Type":"ContainerDied","Data":"9a316939390b06364ea1acf84f271fce9406bcf6aac8571a251819201238c574"} Mar 08 03:18:06.286047 master-0 kubenswrapper[13046]: I0308 03:18:06.286014 13046 scope.go:117] "RemoveContainer" containerID="fce5e0a13a8a0d48e21a3b4ab57b6f9f5f4d96f3d5aa8a45af37601d35ca1619" Mar 08 03:18:06.301570 master-0 kubenswrapper[13046]: I0308 03:18:06.301370 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" event={"ID":"c31f7cee-d21d-4c23-af9b-1e0180b12e1e","Type":"ContainerStarted","Data":"f461a96bf215dff054e4e33573dbd48589c202def10eaebd348ace0e71bb2bcc"} Mar 08 03:18:06.301570 master-0 kubenswrapper[13046]: I0308 03:18:06.301401 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" event={"ID":"c31f7cee-d21d-4c23-af9b-1e0180b12e1e","Type":"ContainerStarted","Data":"65b42de5e2e1b6f62e37c0b36a26d0ffcdff8f4995caa1d47d964576ccb0a8ec"} Mar 08 03:18:06.306217 master-0 kubenswrapper[13046]: I0308 03:18:06.306177 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm92r" event={"ID":"68309159-130a-4ffa-acec-95dc4b795b8f","Type":"ContainerStarted","Data":"c149b517df2ec45a89c00789c7e4fb7b4ad982330685b93d6467d0ca24a16c74"} Mar 08 03:18:06.310899 master-0 kubenswrapper[13046]: I0308 03:18:06.310510 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/0.log" Mar 08 03:18:06.310899 master-0 kubenswrapper[13046]: I0308 03:18:06.310750 13046 generic.go:334] "Generic (PLEG): container finished" podID="17eaab63-9ba9-4a4a-891d-a76aa3f03b46" containerID="12842d6e85f5044777c9e34c897a5f7d9df61dcd7edc5c7f9e3ce75f2cd269a4" exitCode=255 Mar 08 03:18:06.310899 master-0 kubenswrapper[13046]: I0308 03:18:06.310792 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" event={"ID":"17eaab63-9ba9-4a4a-891d-a76aa3f03b46","Type":"ContainerDied","Data":"12842d6e85f5044777c9e34c897a5f7d9df61dcd7edc5c7f9e3ce75f2cd269a4"} Mar 08 03:18:06.313062 master-0 kubenswrapper[13046]: I0308 03:18:06.313030 13046 scope.go:117] "RemoveContainer" 
containerID="12842d6e85f5044777c9e34c897a5f7d9df61dcd7edc5c7f9e3ce75f2cd269a4" Mar 08 03:18:06.320390 master-0 kubenswrapper[13046]: I0308 03:18:06.319296 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/0.log" Mar 08 03:18:06.320390 master-0 kubenswrapper[13046]: I0308 03:18:06.319356 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" event={"ID":"5a2c9576-f7bd-4ac5-a7fe-530f26642f97","Type":"ContainerStarted","Data":"79f755e5e79953802136847a8f8f2fbf40c60f7a80b082c23d87184dffad232b"} Mar 08 03:18:06.337721 master-0 kubenswrapper[13046]: I0308 03:18:06.337677 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm8fd" event={"ID":"6a9d0240-fc00-4d78-9458-8f53b1876f1b","Type":"ContainerStarted","Data":"b2832b9d90414c1e17ab12a22a44053807303c51d15dcbe866e2ce8b6dfacc13"} Mar 08 03:18:06.346747 master-0 kubenswrapper[13046]: I0308 03:18:06.343538 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/1.log" Mar 08 03:18:06.349017 master-0 kubenswrapper[13046]: I0308 03:18:06.348988 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/1.log" Mar 08 03:18:06.349996 master-0 kubenswrapper[13046]: I0308 03:18:06.349746 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/0.log" Mar 08 03:18:06.349996 master-0 kubenswrapper[13046]: I0308 03:18:06.349781 13046 
generic.go:334] "Generic (PLEG): container finished" podID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" containerID="6ad207f364981bf05fab25a7da18e32de5d755db1dc5228665c06ce6ed001108" exitCode=1 Mar 08 03:18:06.349996 master-0 kubenswrapper[13046]: I0308 03:18:06.349827 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" event={"ID":"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9","Type":"ContainerDied","Data":"6ad207f364981bf05fab25a7da18e32de5d755db1dc5228665c06ce6ed001108"} Mar 08 03:18:06.349996 master-0 kubenswrapper[13046]: I0308 03:18:06.349856 13046 scope.go:117] "RemoveContainer" containerID="115308b4e38a50965cda00a6f3da9ba63adca456afd5e8dd547096a0f49ebb12" Mar 08 03:18:06.350285 master-0 kubenswrapper[13046]: I0308 03:18:06.350262 13046 scope.go:117] "RemoveContainer" containerID="6ad207f364981bf05fab25a7da18e32de5d755db1dc5228665c06ce6ed001108" Mar 08 03:18:06.350651 master-0 kubenswrapper[13046]: E0308 03:18:06.350458 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-6598bfb6c4-62spv_openshift-operator-controller(8e1af4e8-2ade-48b3-8c56-0ab78f77fac9)\"" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" podUID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" Mar 08 03:18:06.361886 master-0 kubenswrapper[13046]: I0308 03:18:06.360709 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb" event={"ID":"0e569889-4759-4046-b0ed-e550078521c6","Type":"ContainerStarted","Data":"6b51f1708089850a9ea78dd273ccd1385bae0a1e4f76c214ab47f81e5aa40314"} Mar 08 03:18:06.362879 master-0 kubenswrapper[13046]: I0308 03:18:06.362812 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" event={"ID":"b83ab56c-e28d-4e82-ae8f-92649a1448ed","Type":"ContainerStarted","Data":"f1b9217db8d5e19f2cb58c10ee2d605393c537b6451319c583b6c04aabf8378e"} Mar 08 03:18:06.367235 master-0 kubenswrapper[13046]: I0308 03:18:06.367202 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-64bf9778cb-7hsbf_eedc7538-9cc6-4bf5-9628-e278310d796b/marketplace-operator/1.log" Mar 08 03:18:06.369236 master-0 kubenswrapper[13046]: I0308 03:18:06.368523 13046 generic.go:334] "Generic (PLEG): container finished" podID="eedc7538-9cc6-4bf5-9628-e278310d796b" containerID="8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e" exitCode=1 Mar 08 03:18:06.369236 master-0 kubenswrapper[13046]: I0308 03:18:06.368560 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerDied","Data":"8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e"} Mar 08 03:18:06.369236 master-0 kubenswrapper[13046]: I0308 03:18:06.368904 13046 scope.go:117] "RemoveContainer" containerID="8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e" Mar 08 03:18:06.369236 master-0 kubenswrapper[13046]: E0308 03:18:06.369045 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" Mar 08 03:18:06.382645 master-0 kubenswrapper[13046]: I0308 03:18:06.382593 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" event={"ID":"6bd07fa0-00f3-4267-b64a-1e7c02fdf148","Type":"ContainerStarted","Data":"1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244"} Mar 08 03:18:06.414700 master-0 kubenswrapper[13046]: I0308 03:18:06.414646 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" event={"ID":"1092f2a6-865c-4706-bba7-068621e85ebc","Type":"ContainerStarted","Data":"f11ee247d04d7bc03d237f2769fc0091211f8dc40e889a0542f0508aea7898b5"} Mar 08 03:18:06.414700 master-0 kubenswrapper[13046]: I0308 03:18:06.414693 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" event={"ID":"1092f2a6-865c-4706-bba7-068621e85ebc","Type":"ContainerStarted","Data":"519bc3beb14de1a649f5b4efc69449f7665f68f38bd11235ec05e6e67ad8ad4d"} Mar 08 03:18:06.414700 master-0 kubenswrapper[13046]: I0308 03:18:06.414704 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" event={"ID":"1092f2a6-865c-4706-bba7-068621e85ebc","Type":"ContainerStarted","Data":"71b99594a28a0ded9c84cbe541990e78ed4295d37bb5353fd4aa2f5819a522ca"} Mar 08 03:18:06.424943 master-0 kubenswrapper[13046]: I0308 03:18:06.424907 13046 scope.go:117] "RemoveContainer" containerID="df3e8baabefc90e04c02f0f45ed7aa89841f1f4954012b9c683b090559c5e516" Mar 08 03:18:06.425786 master-0 kubenswrapper[13046]: I0308 03:18:06.425748 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm" event={"ID":"33c15b06-a21e-411f-b324-3ae0c7f0e9a4","Type":"ContainerStarted","Data":"4d242cfc055eb46b941ca9d0c998ddc24eee95b4f21d872b656d905f44eb102a"} Mar 08 03:18:06.425786 master-0 kubenswrapper[13046]: I0308 03:18:06.425783 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-lhvrm" event={"ID":"33c15b06-a21e-411f-b324-3ae0c7f0e9a4","Type":"ContainerStarted","Data":"809b44345a8a6f8ea8a4803c15aad767485a94d34c9b51dd0fddfebc1ac0e9b0"} Mar 08 03:18:06.446459 master-0 kubenswrapper[13046]: I0308 03:18:06.446430 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/1.log" Mar 08 03:18:06.460435 master-0 kubenswrapper[13046]: I0308 03:18:06.460400 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/0.log" Mar 08 03:18:06.460983 master-0 kubenswrapper[13046]: I0308 03:18:06.460927 13046 generic.go:334] "Generic (PLEG): container finished" podID="53254b19-b5b3-4f97-bc64-37be8b2a41b7" containerID="3d85acd9c0f173c5a8fe1a2f6362557217678c6d09d0fa6f7317711af00b0628" exitCode=1 Mar 08 03:18:06.460983 master-0 kubenswrapper[13046]: I0308 03:18:06.460976 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" event={"ID":"53254b19-b5b3-4f97-bc64-37be8b2a41b7","Type":"ContainerDied","Data":"3d85acd9c0f173c5a8fe1a2f6362557217678c6d09d0fa6f7317711af00b0628"} Mar 08 03:18:06.461073 master-0 kubenswrapper[13046]: I0308 03:18:06.461002 13046 scope.go:117] "RemoveContainer" containerID="95cb1ab0414f6248676ceab0da8402d36a93f6fced2ddcec794373deb0d0db80" Mar 08 03:18:06.461578 master-0 kubenswrapper[13046]: I0308 03:18:06.461437 13046 scope.go:117] "RemoveContainer" containerID="3d85acd9c0f173c5a8fe1a2f6362557217678c6d09d0fa6f7317711af00b0628" Mar 08 03:18:06.461658 master-0 kubenswrapper[13046]: E0308 03:18:06.461602 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=manager pod=catalogd-controller-manager-7f8b8b6f4c-gdwg9_openshift-catalogd(53254b19-b5b3-4f97-bc64-37be8b2a41b7)\"" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" podUID="53254b19-b5b3-4f97-bc64-37be8b2a41b7" Mar 08 03:18:06.473783 master-0 kubenswrapper[13046]: I0308 03:18:06.473742 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/0.log" Mar 08 03:18:06.473869 master-0 kubenswrapper[13046]: I0308 03:18:06.473831 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" event={"ID":"fd6b827c-70b0-47ed-b07c-c696343248a8","Type":"ContainerStarted","Data":"3eb78859836f3da919c6f295f9ebd383a5b6b693cee1d6fd99889820e2d9696c"} Mar 08 03:18:06.478739 master-0 kubenswrapper[13046]: I0308 03:18:06.478704 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-98n6d" event={"ID":"982ea338-c7be-4776-9bb7-113834c54aaa","Type":"ContainerStarted","Data":"d606d67219603232ba164132208bf6a30f7f39672e1a55182e5dd4a34a171289"} Mar 08 03:18:06.486264 master-0 kubenswrapper[13046]: I0308 03:18:06.486226 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/0.log" Mar 08 03:18:06.486398 master-0 kubenswrapper[13046]: I0308 03:18:06.486292 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" event={"ID":"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad","Type":"ContainerStarted","Data":"af9687e18f6bdb247588a56f5a4a95f475d79f1ed57a7907b4b508a7261cfc09"} Mar 08 03:18:06.488471 master-0 kubenswrapper[13046]: I0308 03:18:06.488439 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerStarted","Data":"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b"} Mar 08 03:18:06.499145 master-0 kubenswrapper[13046]: I0308 03:18:06.499084 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/1.log" Mar 08 03:18:06.505912 master-0 kubenswrapper[13046]: I0308 03:18:06.505874 13046 scope.go:117] "RemoveContainer" containerID="01b0c9283601294ac91726ca827dcedce1733800cdf20df312fa53a7adcb6145" Mar 08 03:18:06.506089 master-0 kubenswrapper[13046]: E0308 03:18:06.506018 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" Mar 08 03:18:06.512586 master-0 kubenswrapper[13046]: I0308 03:18:06.512208 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=295.51219623 podStartE2EDuration="4m55.51219623s" podCreationTimestamp="2026-03-08 03:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:18:06.510133886 +0000 UTC m=+288.588901103" watchObservedRunningTime="2026-03-08 03:18:06.51219623 +0000 UTC m=+288.590963437" Mar 08 03:18:06.749389 master-0 kubenswrapper[13046]: I0308 03:18:06.749295 13046 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=295.749276497 podStartE2EDuration="4m55.749276497s" podCreationTimestamp="2026-03-08 03:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:18:06.748470726 +0000 UTC m=+288.827237943" watchObservedRunningTime="2026-03-08 03:18:06.749276497 +0000 UTC m=+288.828043714" Mar 08 03:18:06.902013 master-0 kubenswrapper[13046]: I0308 03:18:06.901935 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nggbb"] Mar 08 03:18:06.905327 master-0 kubenswrapper[13046]: I0308 03:18:06.905245 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nggbb"] Mar 08 03:18:07.175102 master-0 kubenswrapper[13046]: I0308 03:18:07.175022 13046 patch_prober.go:28] interesting pod/route-controller-manager-6fbc9556d8-l758n container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:18:07.175339 master-0 kubenswrapper[13046]: I0308 03:18:07.175108 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:18:07.515658 master-0 kubenswrapper[13046]: I0308 03:18:07.515498 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnlct" 
event={"ID":"50ab8f71-42b8-4967-8a0b-016647c59a37","Type":"ContainerStarted","Data":"762d4404f8cdb92dc06014065b025937b13c230a9e685f12ac6e5e528642baac"} Mar 08 03:18:07.518264 master-0 kubenswrapper[13046]: I0308 03:18:07.518219 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-f85rr" event={"ID":"c31f7cee-d21d-4c23-af9b-1e0180b12e1e","Type":"ContainerStarted","Data":"d2a0b8ed616e026fa45f76d09c5e2b5f878a8f87904059d96a6a98b0874d1449"} Mar 08 03:18:07.523827 master-0 kubenswrapper[13046]: I0308 03:18:07.523737 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/1.log" Mar 08 03:18:07.525886 master-0 kubenswrapper[13046]: I0308 03:18:07.525820 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/0.log" Mar 08 03:18:07.526389 master-0 kubenswrapper[13046]: I0308 03:18:07.526306 13046 generic.go:334] "Generic (PLEG): container finished" podID="17eaab63-9ba9-4a4a-891d-a76aa3f03b46" containerID="8233eaa0981cd430ea08d1214c68eb0400a69719755aff5077e292b51d43074c" exitCode=255 Mar 08 03:18:07.526468 master-0 kubenswrapper[13046]: I0308 03:18:07.526398 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" event={"ID":"17eaab63-9ba9-4a4a-891d-a76aa3f03b46","Type":"ContainerDied","Data":"8233eaa0981cd430ea08d1214c68eb0400a69719755aff5077e292b51d43074c"} Mar 08 03:18:07.526547 master-0 kubenswrapper[13046]: I0308 03:18:07.526516 13046 scope.go:117] "RemoveContainer" containerID="12842d6e85f5044777c9e34c897a5f7d9df61dcd7edc5c7f9e3ce75f2cd269a4" Mar 08 03:18:07.527021 master-0 kubenswrapper[13046]: I0308 03:18:07.526988 13046 scope.go:117] "RemoveContainer" 
containerID="8233eaa0981cd430ea08d1214c68eb0400a69719755aff5077e292b51d43074c" Mar 08 03:18:07.528145 master-0 kubenswrapper[13046]: E0308 03:18:07.527227 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-69576476f7-cpnw6_openshift-machine-api(17eaab63-9ba9-4a4a-891d-a76aa3f03b46)\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" podUID="17eaab63-9ba9-4a4a-891d-a76aa3f03b46" Mar 08 03:18:07.532503 master-0 kubenswrapper[13046]: I0308 03:18:07.528997 13046 generic.go:334] "Generic (PLEG): container finished" podID="68309159-130a-4ffa-acec-95dc4b795b8f" containerID="c149b517df2ec45a89c00789c7e4fb7b4ad982330685b93d6467d0ca24a16c74" exitCode=0 Mar 08 03:18:07.532503 master-0 kubenswrapper[13046]: I0308 03:18:07.529098 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm92r" event={"ID":"68309159-130a-4ffa-acec-95dc4b795b8f","Type":"ContainerDied","Data":"c149b517df2ec45a89c00789c7e4fb7b4ad982330685b93d6467d0ca24a16c74"} Mar 08 03:18:07.546255 master-0 kubenswrapper[13046]: I0308 03:18:07.542956 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerStarted","Data":"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143"} Mar 08 03:18:07.546255 master-0 kubenswrapper[13046]: I0308 03:18:07.542995 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerStarted","Data":"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d"} 
Mar 08 03:18:07.550634 master-0 kubenswrapper[13046]: I0308 03:18:07.550594 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-64bf9778cb-7hsbf_eedc7538-9cc6-4bf5-9628-e278310d796b/marketplace-operator/1.log" Mar 08 03:18:07.551111 master-0 kubenswrapper[13046]: I0308 03:18:07.551080 13046 scope.go:117] "RemoveContainer" containerID="8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e" Mar 08 03:18:07.551298 master-0 kubenswrapper[13046]: E0308 03:18:07.551264 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" Mar 08 03:18:07.553911 master-0 kubenswrapper[13046]: I0308 03:18:07.553880 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/1.log" Mar 08 03:18:07.554587 master-0 kubenswrapper[13046]: I0308 03:18:07.554555 13046 scope.go:117] "RemoveContainer" containerID="6ad207f364981bf05fab25a7da18e32de5d755db1dc5228665c06ce6ed001108" Mar 08 03:18:07.554843 master-0 kubenswrapper[13046]: E0308 03:18:07.554812 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-6598bfb6c4-62spv_openshift-operator-controller(8e1af4e8-2ade-48b3-8c56-0ab78f77fac9)\"" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" podUID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" Mar 08 03:18:07.558007 master-0 
kubenswrapper[13046]: I0308 03:18:07.557925 13046 generic.go:334] "Generic (PLEG): container finished" podID="6a9d0240-fc00-4d78-9458-8f53b1876f1b" containerID="b2832b9d90414c1e17ab12a22a44053807303c51d15dcbe866e2ce8b6dfacc13" exitCode=0 Mar 08 03:18:07.558007 master-0 kubenswrapper[13046]: I0308 03:18:07.557991 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm8fd" event={"ID":"6a9d0240-fc00-4d78-9458-8f53b1876f1b","Type":"ContainerDied","Data":"b2832b9d90414c1e17ab12a22a44053807303c51d15dcbe866e2ce8b6dfacc13"} Mar 08 03:18:07.569183 master-0 kubenswrapper[13046]: I0308 03:18:07.569119 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qh2" event={"ID":"9cf6ce1a-c203-4033-86be-be16694a9062","Type":"ContainerStarted","Data":"dd2efeeac4485dbe07c8d83f50b1eb2884c49c43fff88ec87396426a60228f6c"} Mar 08 03:18:07.571447 master-0 kubenswrapper[13046]: I0308 03:18:07.571420 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/1.log" Mar 08 03:18:07.572425 master-0 kubenswrapper[13046]: I0308 03:18:07.572384 13046 scope.go:117] "RemoveContainer" containerID="3d85acd9c0f173c5a8fe1a2f6362557217678c6d09d0fa6f7317711af00b0628" Mar 08 03:18:07.572671 master-0 kubenswrapper[13046]: E0308 03:18:07.572633 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=catalogd-controller-manager-7f8b8b6f4c-gdwg9_openshift-catalogd(53254b19-b5b3-4f97-bc64-37be8b2a41b7)\"" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" podUID="53254b19-b5b3-4f97-bc64-37be8b2a41b7" Mar 08 03:18:07.576127 master-0 kubenswrapper[13046]: I0308 03:18:07.576078 13046 generic.go:334] "Generic (PLEG): container finished" 
podID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" containerID="66fd7431848d65473a6b8fb616b87b5af0bb1ee721d95e6ba6f2b96ce2dd0308" exitCode=0 Mar 08 03:18:07.576259 master-0 kubenswrapper[13046]: I0308 03:18:07.576213 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" event={"ID":"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8","Type":"ContainerDied","Data":"66fd7431848d65473a6b8fb616b87b5af0bb1ee721d95e6ba6f2b96ce2dd0308"} Mar 08 03:18:07.576297 master-0 kubenswrapper[13046]: I0308 03:18:07.576281 13046 scope.go:117] "RemoveContainer" containerID="59a094af1d0fb4c0580e000ba9579d4683433cc1042d1d4763a483f3bf1e9302" Mar 08 03:18:07.576791 master-0 kubenswrapper[13046]: I0308 03:18:07.576748 13046 scope.go:117] "RemoveContainer" containerID="66fd7431848d65473a6b8fb616b87b5af0bb1ee721d95e6ba6f2b96ce2dd0308" Mar 08 03:18:07.577028 master-0 kubenswrapper[13046]: E0308 03:18:07.576987 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-zd6kq_openshift-insights(b33ed2de-435b-4ccc-8dfd-29d52bf95ea8)\"" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" Mar 08 03:18:08.126818 master-0 kubenswrapper[13046]: I0308 03:18:08.126763 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a6e3f01-0f22-4961-b450-56aca5477943" path="/var/lib/kubelet/pods/1a6e3f01-0f22-4961-b450-56aca5477943/volumes" Mar 08 03:18:08.127270 master-0 kubenswrapper[13046]: I0308 03:18:08.127238 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81d3c37-e8d7-44c8-973e-13992380ce85" path="/var/lib/kubelet/pods/e81d3c37-e8d7-44c8-973e-13992380ce85/volumes" Mar 08 03:18:08.579065 master-0 kubenswrapper[13046]: I0308 03:18:08.578559 13046 patch_prober.go:28] interesting 
pod/route-controller-manager-6fbc9556d8-l758n container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:18:08.579065 master-0 kubenswrapper[13046]: I0308 03:18:08.578623 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:18:08.595514 master-0 kubenswrapper[13046]: I0308 03:18:08.592441 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm92r" event={"ID":"68309159-130a-4ffa-acec-95dc4b795b8f","Type":"ContainerStarted","Data":"bdf4f0d861a0b0c26171d87b74b8c01623d0ca3a6b0c9a77de79097b4498f4c2"} Mar 08 03:18:08.600071 master-0 kubenswrapper[13046]: I0308 03:18:08.599965 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/1.log" Mar 08 03:18:08.602797 master-0 kubenswrapper[13046]: I0308 03:18:08.602747 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm8fd" event={"ID":"6a9d0240-fc00-4d78-9458-8f53b1876f1b","Type":"ContainerStarted","Data":"5bc379c3ed5aeb3bdf0c68d4fbbed5d881ef93b73b0dc2403b5e43aae1fe80e1"} Mar 08 03:18:09.610545 master-0 kubenswrapper[13046]: I0308 03:18:09.610393 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-fb55b5d5d-pm69n" Mar 08 03:18:09.906475 
master-0 kubenswrapper[13046]: I0308 03:18:09.906354 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-778796f487-vzb5n" Mar 08 03:18:11.284379 master-0 kubenswrapper[13046]: I0308 03:18:11.284310 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:18:11.285023 master-0 kubenswrapper[13046]: I0308 03:18:11.284396 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:18:11.285761 master-0 kubenswrapper[13046]: I0308 03:18:11.285712 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:18:11.285826 master-0 kubenswrapper[13046]: I0308 03:18:11.285785 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Mar 08 03:18:13.674605 master-0 kubenswrapper[13046]: I0308 03:18:13.674541 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:18:13.675179 master-0 kubenswrapper[13046]: I0308 03:18:13.674998 13046 scope.go:117] "RemoveContainer" containerID="8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e" Mar 08 03:18:13.675237 master-0 kubenswrapper[13046]: E0308 03:18:13.675179 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" Mar 08 03:18:13.725420 master-0 kubenswrapper[13046]: I0308 03:18:13.725393 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:18:13.726217 master-0 kubenswrapper[13046]: I0308 03:18:13.725849 13046 scope.go:117] "RemoveContainer" containerID="01b0c9283601294ac91726ca827dcedce1733800cdf20df312fa53a7adcb6145" Mar 08 03:18:13.726670 master-0 kubenswrapper[13046]: I0308 03:18:13.726616 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:18:13.727004 master-0 kubenswrapper[13046]: E0308 03:18:13.726962 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" 
pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" Mar 08 03:18:13.998189 master-0 kubenswrapper[13046]: I0308 03:18:13.997980 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:13.998189 master-0 kubenswrapper[13046]: I0308 03:18:13.998039 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zm8fd" Mar 08 03:18:14.007919 master-0 kubenswrapper[13046]: I0308 03:18:14.007872 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jnlct" Mar 08 03:18:14.008748 master-0 kubenswrapper[13046]: I0308 03:18:14.008715 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jnlct" Mar 08 03:18:14.052163 master-0 kubenswrapper[13046]: I0308 03:18:14.052102 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jnlct" Mar 08 03:18:14.291831 master-0 kubenswrapper[13046]: I0308 03:18:14.291724 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:14.292288 master-0 kubenswrapper[13046]: I0308 03:18:14.292262 13046 scope.go:117] "RemoveContainer" containerID="6ad207f364981bf05fab25a7da18e32de5d755db1dc5228665c06ce6ed001108" Mar 08 03:18:14.292503 master-0 kubenswrapper[13046]: E0308 03:18:14.292460 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-6598bfb6c4-62spv_openshift-operator-controller(8e1af4e8-2ade-48b3-8c56-0ab78f77fac9)\"" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" podUID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" Mar 08 03:18:14.297908 master-0 kubenswrapper[13046]: I0308 03:18:14.297857 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:18:14.297996 master-0 kubenswrapper[13046]: I0308 03:18:14.297949 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:18:14.298037 master-0 kubenswrapper[13046]: I0308 03:18:14.298008 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:14.298075 master-0 kubenswrapper[13046]: I0308 03:18:14.298059 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:18:14.298121 master-0 kubenswrapper[13046]: I0308 03:18:14.298090 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:18:14.299093 master-0 kubenswrapper[13046]: I0308 03:18:14.299048 13046 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"1028ff741dd20a61a2f7b9e42cfcdd892e469b9feb8dd96c4b2001af9418f66d"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 08 03:18:14.299156 master-0 kubenswrapper[13046]: I0308 03:18:14.299126 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" containerID="cri-o://1028ff741dd20a61a2f7b9e42cfcdd892e469b9feb8dd96c4b2001af9418f66d" gracePeriod=30 Mar 08 03:18:14.308781 master-0 kubenswrapper[13046]: I0308 03:18:14.308734 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:18:14.308997 master-0 kubenswrapper[13046]: I0308 03:18:14.308845 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:18:14.312874 master-0 kubenswrapper[13046]: I0308 03:18:14.311517 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": read tcp 10.128.0.2:56006->10.128.0.12:8443: read: connection reset by peer" start-of-body= Mar 08 03:18:14.312874 master-0 kubenswrapper[13046]: I0308 03:18:14.311555 13046 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": read tcp 10.128.0.2:56006->10.128.0.12:8443: read: connection reset by peer" Mar 08 03:18:14.367890 master-0 kubenswrapper[13046]: I0308 03:18:14.367851 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5qh2" Mar 08 03:18:14.592310 master-0 kubenswrapper[13046]: I0308 03:18:14.592247 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:14.593111 master-0 kubenswrapper[13046]: I0308 03:18:14.593074 13046 scope.go:117] "RemoveContainer" containerID="3d85acd9c0f173c5a8fe1a2f6362557217678c6d09d0fa6f7317711af00b0628" Mar 08 03:18:14.593351 master-0 kubenswrapper[13046]: E0308 03:18:14.593316 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=catalogd-controller-manager-7f8b8b6f4c-gdwg9_openshift-catalogd(53254b19-b5b3-4f97-bc64-37be8b2a41b7)\"" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" podUID="53254b19-b5b3-4f97-bc64-37be8b2a41b7" Mar 08 03:18:14.676782 master-0 kubenswrapper[13046]: I0308 03:18:14.676725 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/1.log" Mar 08 03:18:14.686707 master-0 kubenswrapper[13046]: I0308 03:18:14.677557 13046 generic.go:334] "Generic (PLEG): container finished" podID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerID="1028ff741dd20a61a2f7b9e42cfcdd892e469b9feb8dd96c4b2001af9418f66d" 
exitCode=255
Mar 08 03:18:14.686707 master-0 kubenswrapper[13046]: I0308 03:18:14.677864 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerDied","Data":"1028ff741dd20a61a2f7b9e42cfcdd892e469b9feb8dd96c4b2001af9418f66d"}
Mar 08 03:18:14.686707 master-0 kubenswrapper[13046]: I0308 03:18:14.677911 13046 scope.go:117] "RemoveContainer" containerID="873a972e72df02e333cdd5be8d4415642ae4a31a8ef844e8221962cd437b0309"
Mar 08 03:18:14.686707 master-0 kubenswrapper[13046]: I0308 03:18:14.678578 13046 scope.go:117] "RemoveContainer" containerID="01b0c9283601294ac91726ca827dcedce1733800cdf20df312fa53a7adcb6145"
Mar 08 03:18:14.731821 master-0 kubenswrapper[13046]: I0308 03:18:14.731758 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jnlct"
Mar 08 03:18:14.741141 master-0 kubenswrapper[13046]: I0308 03:18:14.741104 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5qh2"
Mar 08 03:18:14.907411 master-0 kubenswrapper[13046]: I0308 03:18:14.907352 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:18:14.907411 master-0 kubenswrapper[13046]: I0308 03:18:14.907418 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:18:14.951965 master-0 kubenswrapper[13046]: I0308 03:18:14.951912 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:18:15.053945 master-0 kubenswrapper[13046]: I0308 03:18:15.053803 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zm8fd" podUID="6a9d0240-fc00-4d78-9458-8f53b1876f1b" containerName="registry-server" probeResult="failure" output=<
Mar 08 03:18:15.053945 master-0 kubenswrapper[13046]: timeout: failed to connect service ":50051" within 1s
Mar 08 03:18:15.053945 master-0 kubenswrapper[13046]: >
Mar 08 03:18:15.106627 master-0 kubenswrapper[13046]: I0308 03:18:15.106555 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:18:15.106859 master-0 kubenswrapper[13046]: I0308 03:18:15.106807 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" containerID="cri-o://33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0" gracePeriod=5
Mar 08 03:18:15.118707 master-0 kubenswrapper[13046]: I0308 03:18:15.118653 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:18:15.118894 master-0 kubenswrapper[13046]: E0308 03:18:15.118872 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:18:15.626818 master-0 kubenswrapper[13046]: I0308 03:18:15.626732 13046 patch_prober.go:28] interesting pod/route-controller-manager-6fbc9556d8-l758n container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:15.626818 master-0 kubenswrapper[13046]: I0308 03:18:15.626795 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:18:15.688273 master-0 kubenswrapper[13046]: I0308 03:18:15.688016 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/1.log"
Mar 08 03:18:15.689729 master-0 kubenswrapper[13046]: I0308 03:18:15.689653 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerStarted","Data":"722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f"}
Mar 08 03:18:15.690084 master-0 kubenswrapper[13046]: I0308 03:18:15.690030 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr"
Mar 08 03:18:15.693341 master-0 kubenswrapper[13046]: I0308 03:18:15.693290 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/2.log"
Mar 08 03:18:15.694300 master-0 kubenswrapper[13046]: I0308 03:18:15.694263 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/1.log"
Mar 08 03:18:15.695170 master-0 kubenswrapper[13046]: I0308 03:18:15.695090 13046 generic.go:334] "Generic (PLEG): container finished" podID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" containerID="165094a2fe316b3d352ca17316191186ce93ec59e0da03b35da27a01eedbe18d" exitCode=1
Mar 08 03:18:15.695421 master-0 kubenswrapper[13046]: I0308 03:18:15.695324 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" event={"ID":"2bbe9b81-0efb-4caa-bacd-55348cd392c6","Type":"ContainerDied","Data":"165094a2fe316b3d352ca17316191186ce93ec59e0da03b35da27a01eedbe18d"}
Mar 08 03:18:15.695539 master-0 kubenswrapper[13046]: I0308 03:18:15.695462 13046 scope.go:117] "RemoveContainer" containerID="01b0c9283601294ac91726ca827dcedce1733800cdf20df312fa53a7adcb6145"
Mar 08 03:18:15.696440 master-0 kubenswrapper[13046]: I0308 03:18:15.696279 13046 scope.go:117] "RemoveContainer" containerID="165094a2fe316b3d352ca17316191186ce93ec59e0da03b35da27a01eedbe18d"
Mar 08 03:18:15.696616 master-0 kubenswrapper[13046]: E0308 03:18:15.696561 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6"
Mar 08 03:18:15.705400 master-0 kubenswrapper[13046]: I0308 03:18:15.705329 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-qt654" event={"ID":"c3729e29-4c57-4f9b-8202-a87fd3a9a722","Type":"ContainerStarted","Data":"8f1b53a960af1cf03696286470a67b15e8be5f0494374d94e2d2d71f44768550"}
Mar 08 03:18:15.773066 master-0 kubenswrapper[13046]: I0308 03:18:15.772991 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zm92r"
Mar 08 03:18:16.716136 master-0 kubenswrapper[13046]: I0308 03:18:16.716060 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/2.log"
Mar 08 03:18:18.149977 master-0 kubenswrapper[13046]: I0308 03:18:18.149894 13046 scope.go:117] "RemoveContainer" containerID="6f47524787fe6d12f2f00918cc138535f7c801d780aa325200500bc9264d2c6c"
Mar 08 03:18:18.711527 master-0 kubenswrapper[13046]: E0308 03:18:18.711438 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff: no such file or directory, extraDiskErr:
Mar 08 03:18:19.118781 master-0 kubenswrapper[13046]: I0308 03:18:19.118741 13046 scope.go:117] "RemoveContainer" containerID="a2120db451b5376796065b553810bc8500ecbfec6f711b80ba3aef0fdc6a5c29"
Mar 08 03:18:19.747066 master-0 kubenswrapper[13046]: I0308 03:18:19.746918 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/2.log"
Mar 08 03:18:19.747790 master-0 kubenswrapper[13046]: I0308 03:18:19.747738 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/1.log"
Mar 08 03:18:19.748341 master-0 kubenswrapper[13046]: I0308 03:18:19.748274 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" containerID="6b0c5704d344cf0783d51468d328147a058bdde2588e91ebed0ca17bb7b867f7" exitCode=1
Mar 08 03:18:19.748432 master-0 kubenswrapper[13046]: I0308 03:18:19.748349 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" event={"ID":"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b","Type":"ContainerDied","Data":"6b0c5704d344cf0783d51468d328147a058bdde2588e91ebed0ca17bb7b867f7"}
Mar 08 03:18:19.748432 master-0 kubenswrapper[13046]: I0308 03:18:19.748413 13046 scope.go:117] "RemoveContainer" containerID="a2120db451b5376796065b553810bc8500ecbfec6f711b80ba3aef0fdc6a5c29"
Mar 08 03:18:19.749262 master-0 kubenswrapper[13046]: I0308 03:18:19.749157 13046 scope.go:117] "RemoveContainer" containerID="6b0c5704d344cf0783d51468d328147a058bdde2588e91ebed0ca17bb7b867f7"
Mar 08 03:18:19.749684 master-0 kubenswrapper[13046]: E0308 03:18:19.749634 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-gwv4q_openshift-machine-api(5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" podUID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b"
Mar 08 03:18:20.285447 master-0 kubenswrapper[13046]: I0308 03:18:20.285376 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:20.285447 master-0 kubenswrapper[13046]: I0308 03:18:20.285436 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:18:20.285834 master-0 kubenswrapper[13046]: I0308 03:18:20.285387 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:20.285834 master-0 kubenswrapper[13046]: I0308 03:18:20.285584 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:18:20.712373 master-0 kubenswrapper[13046]: I0308 03:18:20.712274 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log"
Mar 08 03:18:20.712636 master-0 kubenswrapper[13046]: I0308 03:18:20.712390 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:18:20.759109 master-0 kubenswrapper[13046]: I0308 03:18:20.759040 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log"
Mar 08 03:18:20.760007 master-0 kubenswrapper[13046]: I0308 03:18:20.759108 13046 generic.go:334] "Generic (PLEG): container finished" podID="f417e14665db2ffffa887ce21c9ff0ed" containerID="33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0" exitCode=137
Mar 08 03:18:20.760007 master-0 kubenswrapper[13046]: I0308 03:18:20.759207 13046 scope.go:117] "RemoveContainer" containerID="33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0"
Mar 08 03:18:20.760007 master-0 kubenswrapper[13046]: I0308 03:18:20.759237 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:18:20.762150 master-0 kubenswrapper[13046]: I0308 03:18:20.762071 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/2.log"
Mar 08 03:18:20.782435 master-0 kubenswrapper[13046]: I0308 03:18:20.782374 13046 scope.go:117] "RemoveContainer" containerID="33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0"
Mar 08 03:18:20.783093 master-0 kubenswrapper[13046]: E0308 03:18:20.783010 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0\": container with ID starting with 33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0 not found: ID does not exist" containerID="33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0"
Mar 08 03:18:20.783232 master-0 kubenswrapper[13046]: I0308 03:18:20.783096 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0"} err="failed to get container status \"33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0\": rpc error: code = NotFound desc = could not find container \"33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0\": container with ID starting with 33a24c18b6a9b6dc40814ff207f108a5df8aa1f49aec75c8f41041a5d542fed0 not found: ID does not exist"
Mar 08 03:18:20.788520 master-0 kubenswrapper[13046]: I0308 03:18:20.788427 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:18:20.788520 master-0 kubenswrapper[13046]: I0308 03:18:20.788471 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:18:20.788716 master-0 kubenswrapper[13046]: I0308 03:18:20.788618 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:18:20.788716 master-0 kubenswrapper[13046]: I0308 03:18:20.788661 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:18:20.788716 master-0 kubenswrapper[13046]: I0308 03:18:20.788683 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 08 03:18:20.788909 master-0 kubenswrapper[13046]: I0308 03:18:20.788758 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log" (OuterVolumeSpecName: "var-log") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:18:20.788909 master-0 kubenswrapper[13046]: I0308 03:18:20.788786 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:18:20.788909 master-0 kubenswrapper[13046]: I0308 03:18:20.788873 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests" (OuterVolumeSpecName: "manifests") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:18:20.789126 master-0 kubenswrapper[13046]: I0308 03:18:20.788769 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:18:20.789126 master-0 kubenswrapper[13046]: I0308 03:18:20.789008 13046 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:20.789126 master-0 kubenswrapper[13046]: I0308 03:18:20.789030 13046 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:20.795719 master-0 kubenswrapper[13046]: I0308 03:18:20.795642 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:18:20.890347 master-0 kubenswrapper[13046]: I0308 03:18:20.890271 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:20.890347 master-0 kubenswrapper[13046]: I0308 03:18:20.890322 13046 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:20.890347 master-0 kubenswrapper[13046]: I0308 03:18:20.890342 13046 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:18:21.118277 master-0 kubenswrapper[13046]: I0308 03:18:21.118202 13046 scope.go:117] "RemoveContainer" containerID="66fd7431848d65473a6b8fb616b87b5af0bb1ee721d95e6ba6f2b96ce2dd0308"
Mar 08 03:18:21.772909 master-0 kubenswrapper[13046]: I0308 03:18:21.772876 13046 generic.go:334] "Generic (PLEG): container finished" podID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" containerID="333ef2dec35fc6ff14f2f1a2c4c625539ba58120d714f2c5f199e8d01f3c0e12" exitCode=0
Mar 08 03:18:21.773373 master-0 kubenswrapper[13046]: I0308 03:18:21.773000 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" event={"ID":"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8","Type":"ContainerDied","Data":"333ef2dec35fc6ff14f2f1a2c4c625539ba58120d714f2c5f199e8d01f3c0e12"}
Mar 08 03:18:21.773460 master-0 kubenswrapper[13046]: I0308 03:18:21.773448 13046 scope.go:117] "RemoveContainer" containerID="66fd7431848d65473a6b8fb616b87b5af0bb1ee721d95e6ba6f2b96ce2dd0308"
Mar 08 03:18:21.775160 master-0 kubenswrapper[13046]: I0308 03:18:21.775074 13046 scope.go:117] "RemoveContainer" containerID="333ef2dec35fc6ff14f2f1a2c4c625539ba58120d714f2c5f199e8d01f3c0e12"
Mar 08 03:18:21.777212 master-0 kubenswrapper[13046]: E0308 03:18:21.777156 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-zd6kq_openshift-insights(b33ed2de-435b-4ccc-8dfd-29d52bf95ea8)\"" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8"
Mar 08 03:18:22.118935 master-0 kubenswrapper[13046]: I0308 03:18:22.118886 13046 scope.go:117] "RemoveContainer" containerID="8233eaa0981cd430ea08d1214c68eb0400a69719755aff5077e292b51d43074c"
Mar 08 03:18:22.146353 master-0 kubenswrapper[13046]: I0308 03:18:22.146268 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f417e14665db2ffffa887ce21c9ff0ed" path="/var/lib/kubelet/pods/f417e14665db2ffffa887ce21c9ff0ed/volumes"
Mar 08 03:18:22.146974 master-0 kubenswrapper[13046]: I0308 03:18:22.146919 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Mar 08 03:18:22.166072 master-0 kubenswrapper[13046]: I0308 03:18:22.165868 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:18:22.166072 master-0 kubenswrapper[13046]: I0308 03:18:22.166036 13046 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="14c23443-0fa6-4028-8f61-09c3b7a2e49a"
Mar 08 03:18:22.174624 master-0 kubenswrapper[13046]: I0308 03:18:22.174564 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:18:22.174781 master-0 kubenswrapper[13046]: I0308 03:18:22.174630 13046 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="14c23443-0fa6-4028-8f61-09c3b7a2e49a"
Mar 08 03:18:22.784355 master-0 kubenswrapper[13046]: I0308 03:18:22.784184 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/2.log"
Mar 08 03:18:22.785210 master-0 kubenswrapper[13046]: I0308 03:18:22.784923 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/1.log"
Mar 08 03:18:22.785533 master-0 kubenswrapper[13046]: I0308 03:18:22.785442 13046 generic.go:334] "Generic (PLEG): container finished" podID="17eaab63-9ba9-4a4a-891d-a76aa3f03b46" containerID="a9d2b3dad7c5e8267e04a16754013d54c3238e5662f29d7e4e345f61b520e85d" exitCode=255
Mar 08 03:18:22.785626 master-0 kubenswrapper[13046]: I0308 03:18:22.785568 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" event={"ID":"17eaab63-9ba9-4a4a-891d-a76aa3f03b46","Type":"ContainerDied","Data":"a9d2b3dad7c5e8267e04a16754013d54c3238e5662f29d7e4e345f61b520e85d"}
Mar 08 03:18:22.785866 master-0 kubenswrapper[13046]: I0308 03:18:22.785623 13046 scope.go:117] "RemoveContainer" containerID="8233eaa0981cd430ea08d1214c68eb0400a69719755aff5077e292b51d43074c"
Mar 08 03:18:22.786368 master-0 kubenswrapper[13046]: I0308 03:18:22.786316 13046 scope.go:117] "RemoveContainer" containerID="a9d2b3dad7c5e8267e04a16754013d54c3238e5662f29d7e4e345f61b520e85d"
Mar 08 03:18:22.786719 master-0 kubenswrapper[13046]: E0308 03:18:22.786658 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-69576476f7-cpnw6_openshift-machine-api(17eaab63-9ba9-4a4a-891d-a76aa3f03b46)\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" podUID="17eaab63-9ba9-4a4a-891d-a76aa3f03b46"
Mar 08 03:18:23.285569 master-0 kubenswrapper[13046]: I0308 03:18:23.285461 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:23.285569 master-0 kubenswrapper[13046]: I0308 03:18:23.285587 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:18:23.286938 master-0 kubenswrapper[13046]: I0308 03:18:23.285857 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:23.286938 master-0 kubenswrapper[13046]: I0308 03:18:23.285971 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:18:23.680756 master-0 kubenswrapper[13046]: I0308 03:18:23.680682 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:18:23.680756 master-0 kubenswrapper[13046]: I0308 03:18:23.680762 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:18:23.681637 master-0 kubenswrapper[13046]: I0308 03:18:23.681592 13046 scope.go:117] "RemoveContainer" containerID="165094a2fe316b3d352ca17316191186ce93ec59e0da03b35da27a01eedbe18d"
Mar 08 03:18:23.681948 master-0 kubenswrapper[13046]: E0308 03:18:23.681887 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6"
Mar 08 03:18:23.800366 master-0 kubenswrapper[13046]: I0308 03:18:23.800274 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/2.log"
Mar 08 03:18:24.046622 master-0 kubenswrapper[13046]: I0308 03:18:24.046568 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:18:24.083223 master-0 kubenswrapper[13046]: I0308 03:18:24.083177 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zm8fd"
Mar 08 03:18:24.291163 master-0 kubenswrapper[13046]: I0308 03:18:24.290982 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:18:24.291841 master-0 kubenswrapper[13046]: I0308 03:18:24.291791 13046 scope.go:117] "RemoveContainer" containerID="6ad207f364981bf05fab25a7da18e32de5d755db1dc5228665c06ce6ed001108"
Mar 08 03:18:24.592603 master-0 kubenswrapper[13046]: I0308 03:18:24.592533 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:18:24.593406 master-0 kubenswrapper[13046]: I0308 03:18:24.593366 13046 scope.go:117] "RemoveContainer" containerID="3d85acd9c0f173c5a8fe1a2f6362557217678c6d09d0fa6f7317711af00b0628"
Mar 08 03:18:24.809220 master-0 kubenswrapper[13046]: I0308 03:18:24.809185 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/2.log"
Mar 08 03:18:24.811780 master-0 kubenswrapper[13046]: I0308 03:18:24.809956 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/1.log"
Mar 08 03:18:24.811780 master-0 kubenswrapper[13046]: I0308 03:18:24.810452 13046 generic.go:334] "Generic (PLEG): container finished" podID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" containerID="db345230f982b5275a648807d0d38fa9bd427abbd6589da846240972998d2824" exitCode=1
Mar 08 03:18:24.811780 master-0 kubenswrapper[13046]: I0308 03:18:24.810520 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" event={"ID":"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9","Type":"ContainerDied","Data":"db345230f982b5275a648807d0d38fa9bd427abbd6589da846240972998d2824"}
Mar 08 03:18:24.811780 master-0 kubenswrapper[13046]: I0308 03:18:24.810596 13046 scope.go:117] "RemoveContainer" containerID="6ad207f364981bf05fab25a7da18e32de5d755db1dc5228665c06ce6ed001108"
Mar 08 03:18:24.811780 master-0 kubenswrapper[13046]: I0308 03:18:24.811605 13046 scope.go:117] "RemoveContainer" containerID="db345230f982b5275a648807d0d38fa9bd427abbd6589da846240972998d2824"
Mar 08 03:18:24.812216 master-0 kubenswrapper[13046]: E0308 03:18:24.812003 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=operator-controller-controller-manager-6598bfb6c4-62spv_openshift-operator-controller(8e1af4e8-2ade-48b3-8c56-0ab78f77fac9)\"" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" podUID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9"
Mar 08 03:18:25.627780 master-0 kubenswrapper[13046]: I0308 03:18:25.627675 13046 patch_prober.go:28] interesting pod/route-controller-manager-6fbc9556d8-l758n container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:25.628033 master-0 kubenswrapper[13046]: I0308 03:18:25.627773 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:18:25.821900 master-0 kubenswrapper[13046]: I0308 03:18:25.821804 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/2.log"
Mar 08 03:18:25.822839 master-0 kubenswrapper[13046]: I0308 03:18:25.822641 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/1.log"
Mar 08 03:18:25.823280 master-0 kubenswrapper[13046]: I0308 03:18:25.823208 13046 generic.go:334] "Generic (PLEG): container finished" podID="53254b19-b5b3-4f97-bc64-37be8b2a41b7" containerID="3a5fd1f6b94545b8f1ed6f9a672b5c39aaa0710c311b4487cd4760dffc555604" exitCode=1
Mar 08 03:18:25.823409 master-0 kubenswrapper[13046]: I0308 03:18:25.823323 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" event={"ID":"53254b19-b5b3-4f97-bc64-37be8b2a41b7","Type":"ContainerDied","Data":"3a5fd1f6b94545b8f1ed6f9a672b5c39aaa0710c311b4487cd4760dffc555604"}
Mar 08 03:18:25.823552 master-0 kubenswrapper[13046]: I0308 03:18:25.823422 13046 scope.go:117] "RemoveContainer" containerID="3d85acd9c0f173c5a8fe1a2f6362557217678c6d09d0fa6f7317711af00b0628"
Mar 08 03:18:25.824393 master-0 kubenswrapper[13046]: I0308 03:18:25.824170 13046 scope.go:117] "RemoveContainer" containerID="3a5fd1f6b94545b8f1ed6f9a672b5c39aaa0710c311b4487cd4760dffc555604"
Mar 08 03:18:25.824601 master-0 kubenswrapper[13046]: E0308 03:18:25.824552 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=catalogd-controller-manager-7f8b8b6f4c-gdwg9_openshift-catalogd(53254b19-b5b3-4f97-bc64-37be8b2a41b7)\"" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" podUID="53254b19-b5b3-4f97-bc64-37be8b2a41b7"
Mar 08 03:18:25.826079 master-0 kubenswrapper[13046]: I0308 03:18:25.825889 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/2.log"
Mar 08 03:18:26.286232 master-0 kubenswrapper[13046]: I0308 03:18:26.286162 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:26.287441 master-0 kubenswrapper[13046]: I0308 03:18:26.286270 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:18:26.287441 master-0 kubenswrapper[13046]: I0308 03:18:26.286289 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 03:18:26.287441 master-0 kubenswrapper[13046]: I0308 03:18:26.286386 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": net/http:
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:18:26.287441 master-0 kubenswrapper[13046]: I0308 03:18:26.286455 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:26.287441 master-0 kubenswrapper[13046]: I0308 03:18:26.287296 13046 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 08 03:18:26.287441 master-0 kubenswrapper[13046]: I0308 03:18:26.287354 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" containerID="cri-o://722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f" gracePeriod=30 Mar 08 03:18:26.304032 master-0 kubenswrapper[13046]: I0308 03:18:26.303944 13046 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-zg4zr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": read tcp 10.128.0.2:41512->10.128.0.12:8443: read: connection reset by peer" start-of-body= Mar 08 03:18:26.304284 master-0 kubenswrapper[13046]: I0308 03:18:26.304042 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": read tcp 
10.128.0.2:41512->10.128.0.12:8443: read: connection reset by peer" Mar 08 03:18:26.410947 master-0 kubenswrapper[13046]: E0308 03:18:26.410875 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-config-operator pod=openshift-config-operator-64488f9d78-zg4zr_openshift-config-operator(3bf93333-b537-4f23-9c77-6a245b290fe3)\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" Mar 08 03:18:26.838360 master-0 kubenswrapper[13046]: I0308 03:18:26.838287 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/2.log" Mar 08 03:18:26.842063 master-0 kubenswrapper[13046]: I0308 03:18:26.841908 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/2.log" Mar 08 03:18:26.842864 master-0 kubenswrapper[13046]: I0308 03:18:26.842801 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/1.log" Mar 08 03:18:26.843784 master-0 kubenswrapper[13046]: I0308 03:18:26.843736 13046 generic.go:334] "Generic (PLEG): container finished" podID="3bf93333-b537-4f23-9c77-6a245b290fe3" containerID="722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f" exitCode=255 Mar 08 03:18:26.843958 master-0 kubenswrapper[13046]: I0308 03:18:26.843794 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" 
event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerDied","Data":"722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f"} Mar 08 03:18:26.843958 master-0 kubenswrapper[13046]: I0308 03:18:26.843845 13046 scope.go:117] "RemoveContainer" containerID="1028ff741dd20a61a2f7b9e42cfcdd892e469b9feb8dd96c4b2001af9418f66d" Mar 08 03:18:26.845153 master-0 kubenswrapper[13046]: I0308 03:18:26.844577 13046 scope.go:117] "RemoveContainer" containerID="722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f" Mar 08 03:18:26.846212 master-0 kubenswrapper[13046]: E0308 03:18:26.846158 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-config-operator pod=openshift-config-operator-64488f9d78-zg4zr_openshift-config-operator(3bf93333-b537-4f23-9c77-6a245b290fe3)\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3" Mar 08 03:18:27.861611 master-0 kubenswrapper[13046]: I0308 03:18:27.861508 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/2.log" Mar 08 03:18:28.126159 master-0 kubenswrapper[13046]: I0308 03:18:28.125973 13046 scope.go:117] "RemoveContainer" containerID="8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e" Mar 08 03:18:28.873976 master-0 kubenswrapper[13046]: I0308 03:18:28.873901 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-64bf9778cb-7hsbf_eedc7538-9cc6-4bf5-9628-e278310d796b/marketplace-operator/2.log" Mar 08 03:18:28.875164 master-0 kubenswrapper[13046]: I0308 03:18:28.874744 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-64bf9778cb-7hsbf_eedc7538-9cc6-4bf5-9628-e278310d796b/marketplace-operator/1.log" Mar 08 03:18:28.875164 master-0 kubenswrapper[13046]: I0308 03:18:28.874826 13046 generic.go:334] "Generic (PLEG): container finished" podID="eedc7538-9cc6-4bf5-9628-e278310d796b" containerID="db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4" exitCode=1 Mar 08 03:18:28.875164 master-0 kubenswrapper[13046]: I0308 03:18:28.874939 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerDied","Data":"db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4"} Mar 08 03:18:28.875164 master-0 kubenswrapper[13046]: I0308 03:18:28.875084 13046 scope.go:117] "RemoveContainer" containerID="8eb0735492dc7e156ae75bb89f6740443769a3abf3b637842cf63892405ffa9e" Mar 08 03:18:28.876075 master-0 kubenswrapper[13046]: I0308 03:18:28.875927 13046 scope.go:117] "RemoveContainer" containerID="db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4" Mar 08 03:18:28.876445 master-0 kubenswrapper[13046]: E0308 03:18:28.876383 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" Mar 08 03:18:29.118951 master-0 kubenswrapper[13046]: I0308 03:18:29.118848 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:18:29.119364 master-0 kubenswrapper[13046]: E0308 03:18:29.119298 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:18:29.887952 master-0 kubenswrapper[13046]: I0308 03:18:29.887754 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-64bf9778cb-7hsbf_eedc7538-9cc6-4bf5-9628-e278310d796b/marketplace-operator/2.log" Mar 08 03:18:31.228010 master-0 kubenswrapper[13046]: I0308 03:18:31.227889 13046 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 03:18:32.118667 master-0 kubenswrapper[13046]: I0308 03:18:32.118590 13046 scope.go:117] "RemoveContainer" containerID="6b0c5704d344cf0783d51468d328147a058bdde2588e91ebed0ca17bb7b867f7" Mar 08 03:18:32.119126 master-0 kubenswrapper[13046]: E0308 03:18:32.119064 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-gwv4q_openshift-machine-api(5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" podUID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" Mar 08 03:18:33.673905 master-0 kubenswrapper[13046]: I0308 03:18:33.673810 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:18:33.674927 master-0 kubenswrapper[13046]: I0308 03:18:33.674009 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:18:33.674927 master-0 kubenswrapper[13046]: I0308 
03:18:33.674571 13046 scope.go:117] "RemoveContainer" containerID="db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4" Mar 08 03:18:33.675162 master-0 kubenswrapper[13046]: E0308 03:18:33.674943 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" Mar 08 03:18:33.918588 master-0 kubenswrapper[13046]: I0308 03:18:33.918515 13046 scope.go:117] "RemoveContainer" containerID="db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4" Mar 08 03:18:33.918866 master-0 kubenswrapper[13046]: E0308 03:18:33.918691 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" Mar 08 03:18:34.119034 master-0 kubenswrapper[13046]: I0308 03:18:34.118952 13046 scope.go:117] "RemoveContainer" containerID="333ef2dec35fc6ff14f2f1a2c4c625539ba58120d714f2c5f199e8d01f3c0e12" Mar 08 03:18:34.119800 master-0 kubenswrapper[13046]: E0308 03:18:34.119276 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-zd6kq_openshift-insights(b33ed2de-435b-4ccc-8dfd-29d52bf95ea8)\"" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" 
Mar 08 03:18:34.291799 master-0 kubenswrapper[13046]: I0308 03:18:34.291721 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:18:34.292687 master-0 kubenswrapper[13046]: I0308 03:18:34.292600 13046 scope.go:117] "RemoveContainer" containerID="db345230f982b5275a648807d0d38fa9bd427abbd6589da846240972998d2824" Mar 08 03:18:34.293009 master-0 kubenswrapper[13046]: E0308 03:18:34.292970 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=operator-controller-controller-manager-6598bfb6c4-62spv_openshift-operator-controller(8e1af4e8-2ade-48b3-8c56-0ab78f77fac9)\"" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" podUID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9" Mar 08 03:18:34.592580 master-0 kubenswrapper[13046]: I0308 03:18:34.592437 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:34.593409 master-0 kubenswrapper[13046]: I0308 03:18:34.593347 13046 scope.go:117] "RemoveContainer" containerID="3a5fd1f6b94545b8f1ed6f9a672b5c39aaa0710c311b4487cd4760dffc555604" Mar 08 03:18:34.593980 master-0 kubenswrapper[13046]: E0308 03:18:34.593860 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=catalogd-controller-manager-7f8b8b6f4c-gdwg9_openshift-catalogd(53254b19-b5b3-4f97-bc64-37be8b2a41b7)\"" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" podUID="53254b19-b5b3-4f97-bc64-37be8b2a41b7" Mar 08 03:18:34.930976 master-0 kubenswrapper[13046]: I0308 03:18:34.930902 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-sjdgk_aadbbe97-2a03-40da-846d-252e29661f67/kube-controller-manager-operator/1.log" Mar 08 03:18:34.932095 master-0 kubenswrapper[13046]: I0308 03:18:34.931998 13046 generic.go:334] "Generic (PLEG): container finished" podID="aadbbe97-2a03-40da-846d-252e29661f67" containerID="35eb39f11e0262a7f614f3efd65f912a40d31232432eefb143835967713072aa" exitCode=255 Mar 08 03:18:34.932229 master-0 kubenswrapper[13046]: I0308 03:18:34.932180 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" event={"ID":"aadbbe97-2a03-40da-846d-252e29661f67","Type":"ContainerDied","Data":"35eb39f11e0262a7f614f3efd65f912a40d31232432eefb143835967713072aa"} Mar 08 03:18:34.932339 master-0 kubenswrapper[13046]: I0308 03:18:34.932255 13046 scope.go:117] "RemoveContainer" containerID="dd690012d95157316ab6ced4002fb7d61efba1abda72967e9e831aa75fc2ce62" Mar 08 03:18:34.933545 master-0 kubenswrapper[13046]: I0308 03:18:34.933434 13046 scope.go:117] "RemoveContainer" containerID="35eb39f11e0262a7f614f3efd65f912a40d31232432eefb143835967713072aa" Mar 08 03:18:34.934347 master-0 kubenswrapper[13046]: E0308 03:18:34.934233 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-86d7cdfdfb-sjdgk_openshift-kube-controller-manager-operator(aadbbe97-2a03-40da-846d-252e29661f67)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" podUID="aadbbe97-2a03-40da-846d-252e29661f67" Mar 08 03:18:34.939183 master-0 kubenswrapper[13046]: I0308 03:18:34.939106 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-hg2f6_3178dfc0-a35e-418e-a954-cd919b8af88c/kube-apiserver-operator/1.log" Mar 08 03:18:34.940881 master-0 kubenswrapper[13046]: I0308 03:18:34.940683 13046 generic.go:334] "Generic (PLEG): container finished" podID="3178dfc0-a35e-418e-a954-cd919b8af88c" containerID="1ab4d8b69b478fc835cd8fd3fc069a281c945a4405e054f4403e41111399ed35" exitCode=255 Mar 08 03:18:34.941042 master-0 kubenswrapper[13046]: I0308 03:18:34.940755 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" event={"ID":"3178dfc0-a35e-418e-a954-cd919b8af88c","Type":"ContainerDied","Data":"1ab4d8b69b478fc835cd8fd3fc069a281c945a4405e054f4403e41111399ed35"} Mar 08 03:18:34.942185 master-0 kubenswrapper[13046]: I0308 03:18:34.942064 13046 scope.go:117] "RemoveContainer" containerID="1ab4d8b69b478fc835cd8fd3fc069a281c945a4405e054f4403e41111399ed35" Mar 08 03:18:34.942941 master-0 kubenswrapper[13046]: E0308 03:18:34.942857 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-68bd585b-hg2f6_openshift-kube-apiserver-operator(3178dfc0-a35e-418e-a954-cd919b8af88c)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" podUID="3178dfc0-a35e-418e-a954-cd919b8af88c" Mar 08 03:18:34.943946 master-0 kubenswrapper[13046]: I0308 03:18:34.943778 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-gfmq4_8c0192f3-2e60-42c6-9836-c70a9fa407d5/etcd-operator/1.log" Mar 08 03:18:34.944624 master-0 kubenswrapper[13046]: I0308 03:18:34.944560 13046 generic.go:334] "Generic (PLEG): container finished" podID="8c0192f3-2e60-42c6-9836-c70a9fa407d5" 
containerID="38d50ee052cbf3d522047ab11f11ab5aceb271a88779bde26e4fce4c2a1ce3bf" exitCode=255 Mar 08 03:18:34.944624 master-0 kubenswrapper[13046]: I0308 03:18:34.944620 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" event={"ID":"8c0192f3-2e60-42c6-9836-c70a9fa407d5","Type":"ContainerDied","Data":"38d50ee052cbf3d522047ab11f11ab5aceb271a88779bde26e4fce4c2a1ce3bf"} Mar 08 03:18:34.945349 master-0 kubenswrapper[13046]: I0308 03:18:34.945279 13046 scope.go:117] "RemoveContainer" containerID="38d50ee052cbf3d522047ab11f11ab5aceb271a88779bde26e4fce4c2a1ce3bf" Mar 08 03:18:34.945670 master-0 kubenswrapper[13046]: E0308 03:18:34.945614 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd-operator pod=etcd-operator-5884b9cd56-gfmq4_openshift-etcd-operator(8c0192f3-2e60-42c6-9836-c70a9fa407d5)\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" podUID="8c0192f3-2e60-42c6-9836-c70a9fa407d5" Mar 08 03:18:34.975578 master-0 kubenswrapper[13046]: I0308 03:18:34.974410 13046 scope.go:117] "RemoveContainer" containerID="a850ed6e7c012bbe7b9c19840009ee86ec09cdada93eef48d0b57da68f29a9e0" Mar 08 03:18:35.005584 master-0 kubenswrapper[13046]: I0308 03:18:35.005516 13046 scope.go:117] "RemoveContainer" containerID="2e008033631dd2c1ad4ec5ade21f6221772733b07ff9c53e1fe52a54485454a9" Mar 08 03:18:35.627309 master-0 kubenswrapper[13046]: I0308 03:18:35.627129 13046 patch_prober.go:28] interesting pod/route-controller-manager-6fbc9556d8-l758n container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:18:35.627309 master-0 kubenswrapper[13046]: I0308 
03:18:35.627253 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:18:35.952302 master-0 kubenswrapper[13046]: I0308 03:18:35.952237 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/1.log" Mar 08 03:18:35.953064 master-0 kubenswrapper[13046]: I0308 03:18:35.952890 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/0.log" Mar 08 03:18:35.953064 master-0 kubenswrapper[13046]: I0308 03:18:35.952952 13046 generic.go:334] "Generic (PLEG): container finished" podID="555ae3b4-71c6-4b62-9e09-66a58ae4c6ad" containerID="af9687e18f6bdb247588a56f5a4a95f475d79f1ed57a7907b4b508a7261cfc09" exitCode=1 Mar 08 03:18:35.953064 master-0 kubenswrapper[13046]: I0308 03:18:35.953033 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" event={"ID":"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad","Type":"ContainerDied","Data":"af9687e18f6bdb247588a56f5a4a95f475d79f1ed57a7907b4b508a7261cfc09"} Mar 08 03:18:35.953277 master-0 kubenswrapper[13046]: I0308 03:18:35.953086 13046 scope.go:117] "RemoveContainer" containerID="34a41043128393510c095711912036e3de6953d35852c470aeee13ef6010b118" Mar 08 03:18:35.953924 master-0 kubenswrapper[13046]: I0308 03:18:35.953878 13046 scope.go:117] "RemoveContainer" containerID="af9687e18f6bdb247588a56f5a4a95f475d79f1ed57a7907b4b508a7261cfc09" 
Mar 08 03:18:35.954345 master-0 kubenswrapper[13046]: E0308 03:18:35.954267 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-j6jpn_openshift-cluster-storage-operator(555ae3b4-71c6-4b62-9e09-66a58ae4c6ad)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" podUID="555ae3b4-71c6-4b62-9e09-66a58ae4c6ad" Mar 08 03:18:35.955803 master-0 kubenswrapper[13046]: I0308 03:18:35.955765 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-qsgq7_70fba73e-c201-4866-bc69-64892ea5bdca/openshift-controller-manager-operator/1.log" Mar 08 03:18:35.958442 master-0 kubenswrapper[13046]: I0308 03:18:35.958401 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-qsgq7_70fba73e-c201-4866-bc69-64892ea5bdca/openshift-controller-manager-operator/0.log" Mar 08 03:18:35.958611 master-0 kubenswrapper[13046]: I0308 03:18:35.958469 13046 generic.go:334] "Generic (PLEG): container finished" podID="70fba73e-c201-4866-bc69-64892ea5bdca" containerID="0d4bbc1c7dcdc0d723f39e3f9d1635f91be7a55d561224fda15ad22d37e6f16a" exitCode=255 Mar 08 03:18:35.958611 master-0 kubenswrapper[13046]: I0308 03:18:35.958559 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" event={"ID":"70fba73e-c201-4866-bc69-64892ea5bdca","Type":"ContainerDied","Data":"0d4bbc1c7dcdc0d723f39e3f9d1635f91be7a55d561224fda15ad22d37e6f16a"} Mar 08 03:18:35.959164 master-0 kubenswrapper[13046]: I0308 03:18:35.959110 13046 scope.go:117] "RemoveContainer" 
containerID="0d4bbc1c7dcdc0d723f39e3f9d1635f91be7a55d561224fda15ad22d37e6f16a" Mar 08 03:18:35.959474 master-0 kubenswrapper[13046]: E0308 03:18:35.959428 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-8565d84698-qsgq7_openshift-controller-manager-operator(70fba73e-c201-4866-bc69-64892ea5bdca)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" podUID="70fba73e-c201-4866-bc69-64892ea5bdca" Mar 08 03:18:35.964620 master-0 kubenswrapper[13046]: I0308 03:18:35.964550 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-8fxl8_ba9496ed-060e-4118-9da6-89b82bd49263/csi-snapshot-controller-operator/1.log" Mar 08 03:18:35.965297 master-0 kubenswrapper[13046]: I0308 03:18:35.965239 13046 generic.go:334] "Generic (PLEG): container finished" podID="ba9496ed-060e-4118-9da6-89b82bd49263" containerID="d7bbde25e7a29335f2b74975574f130c47ac2f7c63ea1d074484ef4e53a9352d" exitCode=255 Mar 08 03:18:35.965705 master-0 kubenswrapper[13046]: I0308 03:18:35.965326 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" event={"ID":"ba9496ed-060e-4118-9da6-89b82bd49263","Type":"ContainerDied","Data":"d7bbde25e7a29335f2b74975574f130c47ac2f7c63ea1d074484ef4e53a9352d"} Mar 08 03:18:35.966178 master-0 kubenswrapper[13046]: I0308 03:18:35.966142 13046 scope.go:117] "RemoveContainer" containerID="d7bbde25e7a29335f2b74975574f130c47ac2f7c63ea1d074484ef4e53a9352d" Mar 08 03:18:35.966448 master-0 kubenswrapper[13046]: E0308 03:18:35.966406 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"csi-snapshot-controller-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=csi-snapshot-controller-operator pod=csi-snapshot-controller-operator-5685fbc7d-8fxl8_openshift-cluster-storage-operator(ba9496ed-060e-4118-9da6-89b82bd49263)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" podUID="ba9496ed-060e-4118-9da6-89b82bd49263" Mar 08 03:18:35.968107 master-0 kubenswrapper[13046]: I0308 03:18:35.968057 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-zqlnx_f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/authentication-operator/1.log" Mar 08 03:18:35.968823 master-0 kubenswrapper[13046]: I0308 03:18:35.968757 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" event={"ID":"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266","Type":"ContainerDied","Data":"5e1a6b1b75dd7e94ec8809b92497979f830922f94f2da614dcc13ffdd7ee147b"} Mar 08 03:18:35.968996 master-0 kubenswrapper[13046]: I0308 03:18:35.968650 13046 generic.go:334] "Generic (PLEG): container finished" podID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266" containerID="5e1a6b1b75dd7e94ec8809b92497979f830922f94f2da614dcc13ffdd7ee147b" exitCode=255 Mar 08 03:18:35.969287 master-0 kubenswrapper[13046]: I0308 03:18:35.969249 13046 scope.go:117] "RemoveContainer" containerID="5e1a6b1b75dd7e94ec8809b92497979f830922f94f2da614dcc13ffdd7ee147b" Mar 08 03:18:35.969640 master-0 kubenswrapper[13046]: E0308 03:18:35.969597 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=authentication-operator pod=authentication-operator-7c6989d6c4-zqlnx_openshift-authentication-operator(f08a644f-3b61-46a7-a7b6-a9f7f2f7d266)\"" 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266"
Mar 08 03:18:35.971409 master-0 kubenswrapper[13046]: I0308 03:18:35.971332 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-krqcr_6432d23b-a55a-4131-83d5-5f16419809dd/openshift-apiserver-operator/1.log"
Mar 08 03:18:35.971896 master-0 kubenswrapper[13046]: I0308 03:18:35.971689 13046 generic.go:334] "Generic (PLEG): container finished" podID="6432d23b-a55a-4131-83d5-5f16419809dd" containerID="6190833c9f0b16bfe58201dc6ffba8cfe78f629e2c29aa82c2bfdfc57bb89f22" exitCode=255
Mar 08 03:18:35.971896 master-0 kubenswrapper[13046]: I0308 03:18:35.971752 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" event={"ID":"6432d23b-a55a-4131-83d5-5f16419809dd","Type":"ContainerDied","Data":"6190833c9f0b16bfe58201dc6ffba8cfe78f629e2c29aa82c2bfdfc57bb89f22"}
Mar 08 03:18:35.972284 master-0 kubenswrapper[13046]: I0308 03:18:35.972159 13046 scope.go:117] "RemoveContainer" containerID="6190833c9f0b16bfe58201dc6ffba8cfe78f629e2c29aa82c2bfdfc57bb89f22"
Mar 08 03:18:35.972365 master-0 kubenswrapper[13046]: E0308 03:18:35.972330 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-799b6db4d7-krqcr_openshift-apiserver-operator(6432d23b-a55a-4131-83d5-5f16419809dd)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" podUID="6432d23b-a55a-4131-83d5-5f16419809dd"
Mar 08 03:18:35.977778 master-0 kubenswrapper[13046]: I0308 03:18:35.977104 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6fbc9556d8-l758n_9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/route-controller-manager/1.log"
Mar 08 03:18:35.979242 master-0 kubenswrapper[13046]: I0308 03:18:35.979127 13046 generic.go:334] "Generic (PLEG): container finished" podID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerID="1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6" exitCode=255
Mar 08 03:18:35.979555 master-0 kubenswrapper[13046]: I0308 03:18:35.979274 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" event={"ID":"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5","Type":"ContainerDied","Data":"1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6"}
Mar 08 03:18:35.980231 master-0 kubenswrapper[13046]: I0308 03:18:35.980196 13046 scope.go:117] "RemoveContainer" containerID="1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6"
Mar 08 03:18:35.980614 master-0 kubenswrapper[13046]: E0308 03:18:35.980560 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"route-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=route-controller-manager pod=route-controller-manager-6fbc9556d8-l758n_openshift-route-controller-manager(9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5)\"" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5"
Mar 08 03:18:35.982854 master-0 kubenswrapper[13046]: I0308 03:18:35.982814 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-h4ldq_7d23557f-6bb1-46ce-a56e-d0011c576125/cluster-olm-operator/1.log"
Mar 08 03:18:35.984362 master-0 kubenswrapper[13046]: I0308 03:18:35.984272 13046 generic.go:334] "Generic (PLEG): container finished" podID="7d23557f-6bb1-46ce-a56e-d0011c576125" containerID="b70730e593cef55ada178b0edb721bde407e4ab726f8f341648fd239c8fd9e8b" exitCode=255
Mar 08 03:18:35.984362 master-0 kubenswrapper[13046]: I0308 03:18:35.984332 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerDied","Data":"b70730e593cef55ada178b0edb721bde407e4ab726f8f341648fd239c8fd9e8b"}
Mar 08 03:18:35.984979 master-0 kubenswrapper[13046]: I0308 03:18:35.984942 13046 scope.go:117] "RemoveContainer" containerID="b70730e593cef55ada178b0edb721bde407e4ab726f8f341648fd239c8fd9e8b"
Mar 08 03:18:35.985298 master-0 kubenswrapper[13046]: E0308 03:18:35.985167 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-77899cf6d-h4ldq_openshift-cluster-olm-operator(7d23557f-6bb1-46ce-a56e-d0011c576125)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" podUID="7d23557f-6bb1-46ce-a56e-d0011c576125"
Mar 08 03:18:35.986868 master-0 kubenswrapper[13046]: I0308 03:18:35.986757 13046 scope.go:117] "RemoveContainer" containerID="7f6f31b4f0d02a09a89d71c35178fb3df4ada57089a0aa97a2782b017503d19e"
Mar 08 03:18:35.988007 master-0 kubenswrapper[13046]: I0308 03:18:35.987931 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69b6fc6b88-57b4v_c0a08ddb-1045-4631-ba52-93f3046ebd0a/service-ca-operator/1.log"
Mar 08 03:18:35.988721 master-0 kubenswrapper[13046]: I0308 03:18:35.988687 13046 generic.go:334] "Generic (PLEG): container finished" podID="c0a08ddb-1045-4631-ba52-93f3046ebd0a" containerID="1cb76e3cff18055c242f15145cff75d29ff51e57b0a87a830e5705157c3314d3" exitCode=255
Mar 08 03:18:35.988887 master-0 kubenswrapper[13046]: I0308 03:18:35.988823 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" event={"ID":"c0a08ddb-1045-4631-ba52-93f3046ebd0a","Type":"ContainerDied","Data":"1cb76e3cff18055c242f15145cff75d29ff51e57b0a87a830e5705157c3314d3"}
Mar 08 03:18:35.989708 master-0 kubenswrapper[13046]: I0308 03:18:35.989643 13046 scope.go:117] "RemoveContainer" containerID="1cb76e3cff18055c242f15145cff75d29ff51e57b0a87a830e5705157c3314d3"
Mar 08 03:18:35.990920 master-0 kubenswrapper[13046]: E0308 03:18:35.990857 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-operator pod=service-ca-operator-69b6fc6b88-57b4v_openshift-service-ca-operator(c0a08ddb-1045-4631-ba52-93f3046ebd0a)\"" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" podUID="c0a08ddb-1045-4631-ba52-93f3046ebd0a"
Mar 08 03:18:35.991607 master-0 kubenswrapper[13046]: I0308 03:18:35.991546 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-7f65c457f5-8wv6c_e71caa06-6ce7-47c9-a267-21f6b6af9247/kube-storage-version-migrator-operator/1.log"
Mar 08 03:18:35.992356 master-0 kubenswrapper[13046]: I0308 03:18:35.992311 13046 generic.go:334] "Generic (PLEG): container finished" podID="e71caa06-6ce7-47c9-a267-21f6b6af9247" containerID="96a9a66bbda3ed971dd7e07c7aa61fb398502ffd9a81746e62e5bce080cd2621" exitCode=255
Mar 08 03:18:35.992647 master-0 kubenswrapper[13046]: I0308 03:18:35.992395 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" event={"ID":"e71caa06-6ce7-47c9-a267-21f6b6af9247","Type":"ContainerDied","Data":"96a9a66bbda3ed971dd7e07c7aa61fb398502ffd9a81746e62e5bce080cd2621"}
Mar 08 03:18:35.993087 master-0 kubenswrapper[13046]: I0308 03:18:35.993043 13046 scope.go:117] "RemoveContainer" containerID="96a9a66bbda3ed971dd7e07c7aa61fb398502ffd9a81746e62e5bce080cd2621"
Mar 08 03:18:35.993529 master-0 kubenswrapper[13046]: E0308 03:18:35.993394 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-7f65c457f5-8wv6c_openshift-kube-storage-version-migrator-operator(e71caa06-6ce7-47c9-a267-21f6b6af9247)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" podUID="e71caa06-6ce7-47c9-a267-21f6b6af9247"
Mar 08 03:18:35.996591 master-0 kubenswrapper[13046]: I0308 03:18:35.996134 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-gfmq4_8c0192f3-2e60-42c6-9836-c70a9fa407d5/etcd-operator/1.log"
Mar 08 03:18:35.998159 master-0 kubenswrapper[13046]: I0308 03:18:35.998123 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-sjdgk_aadbbe97-2a03-40da-846d-252e29661f67/kube-controller-manager-operator/1.log"
Mar 08 03:18:36.001881 master-0 kubenswrapper[13046]: I0308 03:18:36.001752 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-hg2f6_3178dfc0-a35e-418e-a954-cd919b8af88c/kube-apiserver-operator/1.log"
Mar 08 03:18:36.022559 master-0 kubenswrapper[13046]: I0308 03:18:36.022354 13046 scope.go:117] "RemoveContainer" containerID="73db2b17db7b45f368583714c7423ad3baed3f0e6461afd93878b41dc72e8454"
Mar 08 03:18:36.051960 master-0 kubenswrapper[13046]: I0308 03:18:36.051773 13046 scope.go:117] "RemoveContainer" containerID="d901422733644b9a69bd0914635930a2d55c9786ff5a015eee041ee28b2a4386"
Mar 08 03:18:36.071802 master-0 kubenswrapper[13046]: I0308 03:18:36.071746 13046 scope.go:117] "RemoveContainer" containerID="d0f0f4b219b4ac3bb779dcf407e2823f27cc1730a15285e76ae166723d766c7a"
Mar 08 03:18:36.095839 master-0 kubenswrapper[13046]: I0308 03:18:36.095806 13046 scope.go:117] "RemoveContainer" containerID="ad59cc4c7958a82cb7e8357828383997f6ce39b4d62e09c7ada95209a7513c90"
Mar 08 03:18:36.121318 master-0 kubenswrapper[13046]: I0308 03:18:36.121222 13046 scope.go:117] "RemoveContainer" containerID="37ec7f6b3aeafa0c1aa240a3f289ec19e14a9c93e8dc0c62d0b70aca6f9a3fcf"
Mar 08 03:18:36.154883 master-0 kubenswrapper[13046]: I0308 03:18:36.154827 13046 scope.go:117] "RemoveContainer" containerID="1a20bdbedb5b13853225f367842b80deec1d4120a3bc963794fd1350f7fbce22"
Mar 08 03:18:36.230568 master-0 kubenswrapper[13046]: I0308 03:18:36.230525 13046 scope.go:117] "RemoveContainer" containerID="5d6cef5dfa287b776ed75e1b3fc50d94a08caa2b191b2d267b798b1ef6c944f2"
Mar 08 03:18:37.016566 master-0 kubenswrapper[13046]: I0308 03:18:37.016417 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-h4ldq_7d23557f-6bb1-46ce-a56e-d0011c576125/cluster-olm-operator/1.log"
Mar 08 03:18:37.021096 master-0 kubenswrapper[13046]: I0308 03:18:37.021025 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-qsgq7_70fba73e-c201-4866-bc69-64892ea5bdca/openshift-controller-manager-operator/1.log"
Mar 08 03:18:37.024546 master-0 kubenswrapper[13046]: I0308 03:18:37.024475 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-8fxl8_ba9496ed-060e-4118-9da6-89b82bd49263/csi-snapshot-controller-operator/1.log"
Mar 08 03:18:37.027338 master-0 kubenswrapper[13046]: I0308 03:18:37.027284 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-7f65c457f5-8wv6c_e71caa06-6ce7-47c9-a267-21f6b6af9247/kube-storage-version-migrator-operator/1.log"
Mar 08 03:18:37.029896 master-0 kubenswrapper[13046]: I0308 03:18:37.029834 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-zqlnx_f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/authentication-operator/1.log"
Mar 08 03:18:37.032190 master-0 kubenswrapper[13046]: I0308 03:18:37.032122 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-krqcr_6432d23b-a55a-4131-83d5-5f16419809dd/openshift-apiserver-operator/1.log"
Mar 08 03:18:37.034716 master-0 kubenswrapper[13046]: I0308 03:18:37.034664 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-84bfdbbb7f-fqhlq_c9f377bf-79c5-4425-b5d1-256961835f62/service-ca-controller/1.log"
Mar 08 03:18:37.035358 master-0 kubenswrapper[13046]: I0308 03:18:37.035272 13046 generic.go:334] "Generic (PLEG): container finished" podID="c9f377bf-79c5-4425-b5d1-256961835f62" containerID="f64fa3836d9bdc65eec219bc3a2a3624ab55436c5e1f3236c3b13b54753839d7" exitCode=255
Mar 08 03:18:37.035539 master-0 kubenswrapper[13046]: I0308 03:18:37.035382 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" event={"ID":"c9f377bf-79c5-4425-b5d1-256961835f62","Type":"ContainerDied","Data":"f64fa3836d9bdc65eec219bc3a2a3624ab55436c5e1f3236c3b13b54753839d7"}
Mar 08 03:18:37.035539 master-0 kubenswrapper[13046]: I0308 03:18:37.035439 13046 scope.go:117] "RemoveContainer" containerID="d4ea1844b53b95e64939abf18bf680af5d21c94a78af3eaf8fa2b814c48bf2f0"
Mar 08 03:18:37.036369 master-0 kubenswrapper[13046]: I0308 03:18:37.036278 13046 scope.go:117] "RemoveContainer" containerID="f64fa3836d9bdc65eec219bc3a2a3624ab55436c5e1f3236c3b13b54753839d7"
Mar 08 03:18:37.036965 master-0 kubenswrapper[13046]: E0308 03:18:37.036864 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-controller pod=service-ca-84bfdbbb7f-fqhlq_openshift-service-ca(c9f377bf-79c5-4425-b5d1-256961835f62)\"" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" podUID="c9f377bf-79c5-4425-b5d1-256961835f62"
Mar 08 03:18:37.040124 master-0 kubenswrapper[13046]: I0308 03:18:37.040068 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/1.log"
Mar 08 03:18:37.043326 master-0 kubenswrapper[13046]: I0308 03:18:37.043206 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69b6fc6b88-57b4v_c0a08ddb-1045-4631-ba52-93f3046ebd0a/service-ca-operator/1.log"
Mar 08 03:18:37.046274 master-0 kubenswrapper[13046]: I0308 03:18:37.046219 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-4qmzb_0e569889-4759-4046-b0ed-e550078521c6/cluster-storage-operator/0.log"
Mar 08 03:18:37.046448 master-0 kubenswrapper[13046]: I0308 03:18:37.046302 13046 generic.go:334] "Generic (PLEG): container finished" podID="0e569889-4759-4046-b0ed-e550078521c6" containerID="6b51f1708089850a9ea78dd273ccd1385bae0a1e4f76c214ab47f81e5aa40314" exitCode=255
Mar 08 03:18:37.046448 master-0 kubenswrapper[13046]: I0308 03:18:37.046428 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb" event={"ID":"0e569889-4759-4046-b0ed-e550078521c6","Type":"ContainerDied","Data":"6b51f1708089850a9ea78dd273ccd1385bae0a1e4f76c214ab47f81e5aa40314"}
Mar 08 03:18:37.047203 master-0 kubenswrapper[13046]: I0308 03:18:37.047136 13046 scope.go:117] "RemoveContainer" containerID="6b51f1708089850a9ea78dd273ccd1385bae0a1e4f76c214ab47f81e5aa40314"
Mar 08 03:18:37.050315 master-0 kubenswrapper[13046]: I0308 03:18:37.049604 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-hw2kt_b83ab56c-e28d-4e82-ae8f-92649a1448ed/kube-scheduler-operator-container/1.log"
Mar 08 03:18:37.054551 master-0 kubenswrapper[13046]: I0308 03:18:37.053128 13046 generic.go:334] "Generic (PLEG): container finished" podID="b83ab56c-e28d-4e82-ae8f-92649a1448ed" containerID="f1b9217db8d5e19f2cb58c10ee2d605393c537b6451319c583b6c04aabf8378e" exitCode=255
Mar 08 03:18:37.054551 master-0 kubenswrapper[13046]: I0308 03:18:37.053240 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" event={"ID":"b83ab56c-e28d-4e82-ae8f-92649a1448ed","Type":"ContainerDied","Data":"f1b9217db8d5e19f2cb58c10ee2d605393c537b6451319c583b6c04aabf8378e"}
Mar 08 03:18:37.054551 master-0 kubenswrapper[13046]: I0308 03:18:37.053837 13046 scope.go:117] "RemoveContainer" containerID="f1b9217db8d5e19f2cb58c10ee2d605393c537b6451319c583b6c04aabf8378e"
Mar 08 03:18:37.054551 master-0 kubenswrapper[13046]: E0308 03:18:37.054160 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5c74bfc494-hw2kt_openshift-kube-scheduler-operator(b83ab56c-e28d-4e82-ae8f-92649a1448ed)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" podUID="b83ab56c-e28d-4e82-ae8f-92649a1448ed"
Mar 08 03:18:37.056992 master-0 kubenswrapper[13046]: I0308 03:18:37.056892 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6fbc9556d8-l758n_9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/route-controller-manager/1.log"
Mar 08 03:18:37.075817 master-0 kubenswrapper[13046]: I0308 03:18:37.075776 13046 scope.go:117] "RemoveContainer" containerID="ab858aba9fe747164d134176fff1d99d6f77b5114eeaf6f38c2480128cb7485f"
Mar 08 03:18:37.119532 master-0 kubenswrapper[13046]: I0308 03:18:37.118523 13046 scope.go:117] "RemoveContainer" containerID="165094a2fe316b3d352ca17316191186ce93ec59e0da03b35da27a01eedbe18d"
Mar 08 03:18:37.121291 master-0 kubenswrapper[13046]: I0308 03:18:37.121238 13046 scope.go:117] "RemoveContainer" containerID="a9d2b3dad7c5e8267e04a16754013d54c3238e5662f29d7e4e345f61b520e85d"
Mar 08 03:18:37.121563 master-0 kubenswrapper[13046]: E0308 03:18:37.121480 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-69576476f7-cpnw6_openshift-machine-api(17eaab63-9ba9-4a4a-891d-a76aa3f03b46)\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" podUID="17eaab63-9ba9-4a4a-891d-a76aa3f03b46"
Mar 08 03:18:38.067371 master-0 kubenswrapper[13046]: I0308 03:18:38.067246 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-84bfdbbb7f-fqhlq_c9f377bf-79c5-4425-b5d1-256961835f62/service-ca-controller/1.log"
Mar 08 03:18:38.070990 master-0 kubenswrapper[13046]: I0308 03:18:38.070919 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-4qmzb_0e569889-4759-4046-b0ed-e550078521c6/cluster-storage-operator/0.log"
Mar 08 03:18:38.071203 master-0 kubenswrapper[13046]: I0308 03:18:38.071158 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-4qmzb" event={"ID":"0e569889-4759-4046-b0ed-e550078521c6","Type":"ContainerStarted","Data":"ddd55f1ca5a4c865abf70b12bcc55ba2ff019ab9db3744776af2614cee85806f"}
Mar 08 03:18:38.075696 master-0 kubenswrapper[13046]: I0308 03:18:38.075637 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-hw2kt_b83ab56c-e28d-4e82-ae8f-92649a1448ed/kube-scheduler-operator-container/1.log"
Mar 08 03:18:38.078646 master-0 kubenswrapper[13046]: I0308 03:18:38.078451 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/3.log"
Mar 08 03:18:38.079432 master-0 kubenswrapper[13046]: I0308 03:18:38.079382 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/2.log"
Mar 08 03:18:38.080298 master-0 kubenswrapper[13046]: I0308 03:18:38.080247 13046 generic.go:334] "Generic (PLEG): container finished" podID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" containerID="408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814" exitCode=1
Mar 08 03:18:38.080773 master-0 kubenswrapper[13046]: I0308 03:18:38.080370 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" event={"ID":"2bbe9b81-0efb-4caa-bacd-55348cd392c6","Type":"ContainerDied","Data":"408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814"}
Mar 08 03:18:38.080773 master-0 kubenswrapper[13046]: I0308 03:18:38.080467 13046 scope.go:117] "RemoveContainer" containerID="165094a2fe316b3d352ca17316191186ce93ec59e0da03b35da27a01eedbe18d"
Mar 08 03:18:38.081442 master-0 kubenswrapper[13046]: I0308 03:18:38.081373 13046 scope.go:117] "RemoveContainer" containerID="408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814"
Mar 08 03:18:38.081832 master-0 kubenswrapper[13046]: E0308 03:18:38.081768 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6"
Mar 08 03:18:39.093600 master-0 kubenswrapper[13046]: I0308 03:18:39.093477 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/3.log"
Mar 08 03:18:41.118184 master-0 kubenswrapper[13046]: I0308 03:18:41.118139 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:18:41.119431 master-0 kubenswrapper[13046]: E0308 03:18:41.119393 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:18:42.119253 master-0 kubenswrapper[13046]: I0308 03:18:42.119170 13046 scope.go:117] "RemoveContainer" containerID="722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f"
Mar 08 03:18:42.120200 master-0 kubenswrapper[13046]: E0308 03:18:42.119555 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-config-operator pod=openshift-config-operator-64488f9d78-zg4zr_openshift-config-operator(3bf93333-b537-4f23-9c77-6a245b290fe3)\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" podUID="3bf93333-b537-4f23-9c77-6a245b290fe3"
Mar 08 03:18:43.119417 master-0 kubenswrapper[13046]: I0308 03:18:43.118677 13046 scope.go:117] "RemoveContainer" containerID="6b0c5704d344cf0783d51468d328147a058bdde2588e91ebed0ca17bb7b867f7"
Mar 08 03:18:43.681580 master-0 kubenswrapper[13046]: I0308 03:18:43.681395 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:18:43.682247 master-0 kubenswrapper[13046]: I0308 03:18:43.682157 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:18:43.682407 master-0 kubenswrapper[13046]: I0308 03:18:43.682254 13046 scope.go:117] "RemoveContainer" containerID="408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814"
Mar 08 03:18:43.682711 master-0 kubenswrapper[13046]: E0308 03:18:43.682581 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6"
Mar 08 03:18:44.134296 master-0 kubenswrapper[13046]: I0308 03:18:44.134241 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/3.log"
Mar 08 03:18:44.135109 master-0 kubenswrapper[13046]: I0308 03:18:44.135069 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/2.log"
Mar 08 03:18:44.135981 master-0 kubenswrapper[13046]: I0308 03:18:44.135856 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" containerID="523a4995d1091d85f6f4c5b24c00c4c59b3b9c15a7de5526ab7cdec00a907b84" exitCode=1
Mar 08 03:18:44.135981 master-0 kubenswrapper[13046]: I0308 03:18:44.135910 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" event={"ID":"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b","Type":"ContainerDied","Data":"523a4995d1091d85f6f4c5b24c00c4c59b3b9c15a7de5526ab7cdec00a907b84"}
Mar 08 03:18:44.136094 master-0 kubenswrapper[13046]: I0308 03:18:44.136005 13046 scope.go:117] "RemoveContainer" containerID="6b0c5704d344cf0783d51468d328147a058bdde2588e91ebed0ca17bb7b867f7"
Mar 08 03:18:44.136661 master-0 kubenswrapper[13046]: I0308 03:18:44.136635 13046 scope.go:117] "RemoveContainer" containerID="408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814"
Mar 08 03:18:44.136784 master-0 kubenswrapper[13046]: I0308 03:18:44.136753 13046 scope.go:117] "RemoveContainer" containerID="523a4995d1091d85f6f4c5b24c00c4c59b3b9c15a7de5526ab7cdec00a907b84"
Mar 08 03:18:44.136929 master-0 kubenswrapper[13046]: E0308 03:18:44.136899 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6"
Mar 08 03:18:44.137350 master-0 kubenswrapper[13046]: E0308 03:18:44.137315 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-gwv4q_openshift-machine-api(5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" podUID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b"
Mar 08 03:18:44.291637 master-0 kubenswrapper[13046]: I0308 03:18:44.291574 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv"
Mar 08 03:18:44.292409 master-0 kubenswrapper[13046]: I0308 03:18:44.292376 13046 scope.go:117] "RemoveContainer" containerID="db345230f982b5275a648807d0d38fa9bd427abbd6589da846240972998d2824"
Mar 08 03:18:44.292776 master-0 kubenswrapper[13046]: E0308 03:18:44.292738 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=operator-controller-controller-manager-6598bfb6c4-62spv_openshift-operator-controller(8e1af4e8-2ade-48b3-8c56-0ab78f77fac9)\"" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" podUID="8e1af4e8-2ade-48b3-8c56-0ab78f77fac9"
Mar 08 03:18:44.584408 master-0 kubenswrapper[13046]: I0308 03:18:44.584314 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx"
Mar 08 03:18:44.584936 master-0 kubenswrapper[13046]: I0308 03:18:44.584902 13046 scope.go:117] "RemoveContainer" containerID="5e1a6b1b75dd7e94ec8809b92497979f830922f94f2da614dcc13ffdd7ee147b"
Mar 08 03:18:44.585099 master-0 kubenswrapper[13046]: E0308 03:18:44.585072 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=authentication-operator pod=authentication-operator-7c6989d6c4-zqlnx_openshift-authentication-operator(f08a644f-3b61-46a7-a7b6-a9f7f2f7d266)\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" podUID="f08a644f-3b61-46a7-a7b6-a9f7f2f7d266"
Mar 08 03:18:44.593069 master-0 kubenswrapper[13046]: I0308 03:18:44.593007 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9"
Mar 08 03:18:44.593706 master-0 kubenswrapper[13046]: I0308 03:18:44.593505 13046 scope.go:117] "RemoveContainer" containerID="3a5fd1f6b94545b8f1ed6f9a672b5c39aaa0710c311b4487cd4760dffc555604"
Mar 08 03:18:44.593706 master-0 kubenswrapper[13046]: E0308 03:18:44.593647 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=catalogd-controller-manager-7f8b8b6f4c-gdwg9_openshift-catalogd(53254b19-b5b3-4f97-bc64-37be8b2a41b7)\"" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" podUID="53254b19-b5b3-4f97-bc64-37be8b2a41b7"
Mar 08 03:18:44.627123 master-0 kubenswrapper[13046]: I0308 03:18:44.626951 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:18:44.628690 master-0 kubenswrapper[13046]: I0308 03:18:44.628656 13046 scope.go:117] "RemoveContainer" containerID="1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6"
Mar 08 03:18:44.629030 master-0 kubenswrapper[13046]: E0308 03:18:44.628966 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"route-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=route-controller-manager pod=route-controller-manager-6fbc9556d8-l758n_openshift-route-controller-manager(9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5)\"" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5"
Mar 08 03:18:45.148644 master-0 kubenswrapper[13046]: I0308 03:18:45.148548 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/3.log"
Mar 08 03:18:46.118996 master-0 kubenswrapper[13046]: I0308 03:18:46.118912 13046 scope.go:117] "RemoveContainer" containerID="35eb39f11e0262a7f614f3efd65f912a40d31232432eefb143835967713072aa"
Mar 08 03:18:47.118366 master-0 kubenswrapper[13046]: I0308 03:18:47.118276 13046 scope.go:117] "RemoveContainer" containerID="38d50ee052cbf3d522047ab11f11ab5aceb271a88779bde26e4fce4c2a1ce3bf"
Mar 08 03:18:47.119255 master-0 kubenswrapper[13046]: I0308 03:18:47.119182 13046 scope.go:117] "RemoveContainer" containerID="1cb76e3cff18055c242f15145cff75d29ff51e57b0a87a830e5705157c3314d3"
Mar 08 03:18:47.120139 master-0 kubenswrapper[13046]: I0308 03:18:47.120088 13046 scope.go:117] "RemoveContainer" containerID="333ef2dec35fc6ff14f2f1a2c4c625539ba58120d714f2c5f199e8d01f3c0e12"
Mar 08 03:18:47.121089 master-0 kubenswrapper[13046]: I0308 03:18:47.121026 13046 scope.go:117] "RemoveContainer" containerID="0d4bbc1c7dcdc0d723f39e3f9d1635f91be7a55d561224fda15ad22d37e6f16a"
Mar 08 03:18:47.193697 master-0 kubenswrapper[13046]: I0308 03:18:47.193615 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-sjdgk_aadbbe97-2a03-40da-846d-252e29661f67/kube-controller-manager-operator/1.log"
Mar 08 03:18:47.193941 master-0 kubenswrapper[13046]: I0308 03:18:47.193710 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-sjdgk" event={"ID":"aadbbe97-2a03-40da-846d-252e29661f67","Type":"ContainerStarted","Data":"a1073693b3a5d4865ef1ab501d32a2c2995b9e94ebad59f7a5c6dd146803949c"}
Mar 08 03:18:48.123829 master-0 kubenswrapper[13046]: I0308 03:18:48.123777 13046 scope.go:117] "RemoveContainer" containerID="af9687e18f6bdb247588a56f5a4a95f475d79f1ed57a7907b4b508a7261cfc09"
Mar 08 03:18:48.124473 master-0 kubenswrapper[13046]: I0308 03:18:48.123991 13046 scope.go:117] "RemoveContainer" containerID="db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4"
Mar 08 03:18:48.124473 master-0 kubenswrapper[13046]: I0308 03:18:48.124331 13046 scope.go:117] "RemoveContainer" containerID="f64fa3836d9bdc65eec219bc3a2a3624ab55436c5e1f3236c3b13b54753839d7"
Mar 08 03:18:48.124596 master-0 kubenswrapper[13046]: E0308 03:18:48.124333 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b"
Mar 08 03:18:48.204276 master-0 kubenswrapper[13046]: I0308 03:18:48.204186 13046 generic.go:334] "Generic (PLEG): container finished" podID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" containerID="6793e29fa45e989dedfbb6168fd8df0d147bdd307f2f1901888c8cee650797a6" exitCode=0
Mar 08 03:18:48.204532 master-0 kubenswrapper[13046]: I0308 03:18:48.204303 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" event={"ID":"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8","Type":"ContainerDied","Data":"6793e29fa45e989dedfbb6168fd8df0d147bdd307f2f1901888c8cee650797a6"}
Mar 08 03:18:48.204532 master-0 kubenswrapper[13046]: I0308 03:18:48.204356 13046 scope.go:117] "RemoveContainer" containerID="333ef2dec35fc6ff14f2f1a2c4c625539ba58120d714f2c5f199e8d01f3c0e12"
Mar 08 03:18:48.205115 master-0 kubenswrapper[13046]: I0308 03:18:48.205069 13046 scope.go:117] "RemoveContainer" containerID="6793e29fa45e989dedfbb6168fd8df0d147bdd307f2f1901888c8cee650797a6"
Mar 08 03:18:48.205416 master-0 kubenswrapper[13046]: E0308 03:18:48.205384 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-zd6kq_openshift-insights(b33ed2de-435b-4ccc-8dfd-29d52bf95ea8)\"" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8"
Mar 08 03:18:48.209265 master-0 kubenswrapper[13046]: I0308 03:18:48.208710 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-gfmq4_8c0192f3-2e60-42c6-9836-c70a9fa407d5/etcd-operator/1.log"
Mar 08 03:18:48.209265 master-0 kubenswrapper[13046]: I0308 03:18:48.208862 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-gfmq4" event={"ID":"8c0192f3-2e60-42c6-9836-c70a9fa407d5","Type":"ContainerStarted","Data":"d8fbf4b6226449f237a2185db4bf538cabd306f2a2a7dd1540a54c16120cf2c4"}
Mar 08 03:18:48.218795 master-0 kubenswrapper[13046]: I0308 03:18:48.218737 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-qsgq7_70fba73e-c201-4866-bc69-64892ea5bdca/openshift-controller-manager-operator/1.log"
Mar 08 03:18:48.219071 master-0 kubenswrapper[13046]: I0308 03:18:48.218927 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-qsgq7" event={"ID":"70fba73e-c201-4866-bc69-64892ea5bdca","Type":"ContainerStarted","Data":"2b64b9c111433c3a8ff27cea7b4684e61824bbc47396a334db4ae5c712631697"}
Mar 08 03:18:48.223320 master-0 kubenswrapper[13046]: I0308 03:18:48.223220 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69b6fc6b88-57b4v_c0a08ddb-1045-4631-ba52-93f3046ebd0a/service-ca-operator/1.log"
Mar 08 03:18:48.223320 master-0 kubenswrapper[13046]: I0308 03:18:48.223304 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-57b4v" event={"ID":"c0a08ddb-1045-4631-ba52-93f3046ebd0a","Type":"ContainerStarted","Data":"2c2369fb3677fc42801bc4407c4b111ae704b9e60a3d1140fcd31fe5776de3c2"}
Mar 08 03:18:49.118561 master-0 kubenswrapper[13046]: I0308 03:18:49.118509 13046 scope.go:117] "RemoveContainer" containerID="d7bbde25e7a29335f2b74975574f130c47ac2f7c63ea1d074484ef4e53a9352d"
Mar 08 03:18:49.231453 master-0 kubenswrapper[13046]: I0308 03:18:49.231410 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-84bfdbbb7f-fqhlq_c9f377bf-79c5-4425-b5d1-256961835f62/service-ca-controller/1.log"
Mar 08 03:18:49.231936 master-0 kubenswrapper[13046]: I0308 03:18:49.231504 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-fqhlq" event={"ID":"c9f377bf-79c5-4425-b5d1-256961835f62","Type":"ContainerStarted","Data":"3f024589860b6ace684bf7cb427704a78c44fe0838ec3b09921d36475d787e62"}
Mar 08 03:18:49.234977 master-0 kubenswrapper[13046]: I0308 03:18:49.234944 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/1.log"
Mar 08 03:18:49.235044 master-0 kubenswrapper[13046]: I0308 03:18:49.235007 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" event={"ID":"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad","Type":"ContainerStarted","Data":"470f17a045f36232ef4552e6c7cc64d2eb892c434c99999582d5fb1ef0f7249c"}
Mar 08 03:18:49.237990 master-0 kubenswrapper[13046]: I0308 03:18:49.237955 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-8fxl8_ba9496ed-060e-4118-9da6-89b82bd49263/csi-snapshot-controller-operator/1.log"
Mar 08 03:18:49.238050 master-0 kubenswrapper[13046]: I0308 03:18:49.237997 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-8fxl8" event={"ID":"ba9496ed-060e-4118-9da6-89b82bd49263","Type":"ContainerStarted","Data":"733f3b0b1544a50afe4f6fb14ad21eb60915ea7ed62fe1a1e47cfa8ccd6154f5"}
Mar 08 03:18:50.119069 master-0 kubenswrapper[13046]: I0308 03:18:50.118971 13046 scope.go:117] "RemoveContainer" containerID="1ab4d8b69b478fc835cd8fd3fc069a281c945a4405e054f4403e41111399ed35"
Mar 08 03:18:50.119374 master-0 kubenswrapper[13046]: I0308 03:18:50.119343 13046 scope.go:117] "RemoveContainer" containerID="6190833c9f0b16bfe58201dc6ffba8cfe78f629e2c29aa82c2bfdfc57bb89f22"
Mar 08
03:18:51.118689 master-0 kubenswrapper[13046]: I0308 03:18:51.118598 13046 scope.go:117] "RemoveContainer" containerID="96a9a66bbda3ed971dd7e07c7aa61fb398502ffd9a81746e62e5bce080cd2621" Mar 08 03:18:51.119288 master-0 kubenswrapper[13046]: I0308 03:18:51.118795 13046 scope.go:117] "RemoveContainer" containerID="a9d2b3dad7c5e8267e04a16754013d54c3238e5662f29d7e4e345f61b520e85d" Mar 08 03:18:51.119288 master-0 kubenswrapper[13046]: I0308 03:18:51.118970 13046 scope.go:117] "RemoveContainer" containerID="b70730e593cef55ada178b0edb721bde407e4ab726f8f341648fd239c8fd9e8b" Mar 08 03:18:51.266249 master-0 kubenswrapper[13046]: I0308 03:18:51.266181 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-hg2f6_3178dfc0-a35e-418e-a954-cd919b8af88c/kube-apiserver-operator/1.log" Mar 08 03:18:51.266530 master-0 kubenswrapper[13046]: I0308 03:18:51.266421 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-hg2f6" event={"ID":"3178dfc0-a35e-418e-a954-cd919b8af88c","Type":"ContainerStarted","Data":"aee8d526b1a79b81d8175005f0fc5e5f184950d9b9d212c771e1ddf33be67881"} Mar 08 03:18:51.271599 master-0 kubenswrapper[13046]: I0308 03:18:51.271475 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-krqcr_6432d23b-a55a-4131-83d5-5f16419809dd/openshift-apiserver-operator/1.log" Mar 08 03:18:51.271599 master-0 kubenswrapper[13046]: I0308 03:18:51.271560 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-krqcr" event={"ID":"6432d23b-a55a-4131-83d5-5f16419809dd","Type":"ContainerStarted","Data":"9afbd04046ab847df7174d3fcc1e826e00bce329cebc679d5710683388e53e51"} Mar 08 03:18:52.118633 master-0 kubenswrapper[13046]: I0308 03:18:52.118562 13046 scope.go:117] "RemoveContainer" 
containerID="f1b9217db8d5e19f2cb58c10ee2d605393c537b6451319c583b6c04aabf8378e" Mar 08 03:18:52.286221 master-0 kubenswrapper[13046]: I0308 03:18:52.286154 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-h4ldq_7d23557f-6bb1-46ce-a56e-d0011c576125/cluster-olm-operator/1.log" Mar 08 03:18:52.287281 master-0 kubenswrapper[13046]: I0308 03:18:52.287210 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-h4ldq" event={"ID":"7d23557f-6bb1-46ce-a56e-d0011c576125","Type":"ContainerStarted","Data":"31a5dc98994a3f5251d9c75dfbc29349ba0d10d80d30ccdd662bfe86bddb810d"} Mar 08 03:18:52.295901 master-0 kubenswrapper[13046]: I0308 03:18:52.295840 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/2.log" Mar 08 03:18:52.296401 master-0 kubenswrapper[13046]: I0308 03:18:52.296298 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-cpnw6" event={"ID":"17eaab63-9ba9-4a4a-891d-a76aa3f03b46","Type":"ContainerStarted","Data":"062a2fef0c72c09e5b175986613b792a12987bdb66d380b3610a7190143fb274"} Mar 08 03:18:52.300229 master-0 kubenswrapper[13046]: I0308 03:18:52.300162 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-7f65c457f5-8wv6c_e71caa06-6ce7-47c9-a267-21f6b6af9247/kube-storage-version-migrator-operator/1.log" Mar 08 03:18:52.300433 master-0 kubenswrapper[13046]: I0308 03:18:52.300278 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-8wv6c" 
event={"ID":"e71caa06-6ce7-47c9-a267-21f6b6af9247","Type":"ContainerStarted","Data":"c0ac2f86af3e9d12b218e2268a41092b2f7d36e1d7231f380f174f10c1249c24"} Mar 08 03:18:53.118924 master-0 kubenswrapper[13046]: I0308 03:18:53.118852 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:18:53.119459 master-0 kubenswrapper[13046]: E0308 03:18:53.119213 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:18:53.309359 master-0 kubenswrapper[13046]: I0308 03:18:53.309268 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-hw2kt_b83ab56c-e28d-4e82-ae8f-92649a1448ed/kube-scheduler-operator-container/1.log" Mar 08 03:18:53.309359 master-0 kubenswrapper[13046]: I0308 03:18:53.309348 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-hw2kt" event={"ID":"b83ab56c-e28d-4e82-ae8f-92649a1448ed","Type":"ContainerStarted","Data":"396602fbc64f92acd46fcf666b76461bddc485426d8f56b1f6830f5d0a873a63"} Mar 08 03:18:56.119008 master-0 kubenswrapper[13046]: I0308 03:18:56.118928 13046 scope.go:117] "RemoveContainer" containerID="722606dfa123691163343f1d630b535e2ac981506caa331fc8462aebea27dd7f" Mar 08 03:18:56.336193 master-0 kubenswrapper[13046]: I0308 03:18:56.336104 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/2.log" Mar 08 
03:18:56.336916 master-0 kubenswrapper[13046]: I0308 03:18:56.336837 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" event={"ID":"3bf93333-b537-4f23-9c77-6a245b290fe3","Type":"ContainerStarted","Data":"73efda2545bb54c5b6ce275451a15b7c0bacf543ceeb29b52043ed6ed7864545"} Mar 08 03:18:56.349560 master-0 kubenswrapper[13046]: I0308 03:18:56.349442 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:18:57.118100 master-0 kubenswrapper[13046]: I0308 03:18:57.118014 13046 scope.go:117] "RemoveContainer" containerID="1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6" Mar 08 03:18:57.347513 master-0 kubenswrapper[13046]: I0308 03:18:57.347435 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6fbc9556d8-l758n_9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/route-controller-manager/1.log" Mar 08 03:18:57.348099 master-0 kubenswrapper[13046]: I0308 03:18:57.347632 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" event={"ID":"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5","Type":"ContainerStarted","Data":"b182ce771137b026d98ba2b5690c6edd71dd016caec91bd289049a210a43602b"} Mar 08 03:18:57.348244 master-0 kubenswrapper[13046]: I0308 03:18:57.348186 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:57.349989 master-0 kubenswrapper[13046]: I0308 03:18:57.349937 13046 patch_prober.go:28] interesting pod/route-controller-manager-6fbc9556d8-l758n container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:8443/healthz\": dial tcp 10.128.0.68:8443: 
connect: connection refused" start-of-body= Mar 08 03:18:57.350052 master-0 kubenswrapper[13046]: I0308 03:18:57.350002 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.68:8443/healthz\": dial tcp 10.128.0.68:8443: connect: connection refused" Mar 08 03:18:58.121515 master-0 kubenswrapper[13046]: I0308 03:18:58.121411 13046 scope.go:117] "RemoveContainer" containerID="3a5fd1f6b94545b8f1ed6f9a672b5c39aaa0710c311b4487cd4760dffc555604" Mar 08 03:18:58.121809 master-0 kubenswrapper[13046]: I0308 03:18:58.121580 13046 scope.go:117] "RemoveContainer" containerID="5e1a6b1b75dd7e94ec8809b92497979f830922f94f2da614dcc13ffdd7ee147b" Mar 08 03:18:58.122014 master-0 kubenswrapper[13046]: I0308 03:18:58.121959 13046 scope.go:117] "RemoveContainer" containerID="523a4995d1091d85f6f4c5b24c00c4c59b3b9c15a7de5526ab7cdec00a907b84" Mar 08 03:18:58.122833 master-0 kubenswrapper[13046]: E0308 03:18:58.122470 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-gwv4q_openshift-machine-api(5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" podUID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b" Mar 08 03:18:58.356868 master-0 kubenswrapper[13046]: I0308 03:18:58.356821 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-zqlnx_f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/authentication-operator/1.log" Mar 08 03:18:58.364050 master-0 kubenswrapper[13046]: I0308 03:18:58.356936 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-zqlnx" event={"ID":"f08a644f-3b61-46a7-a7b6-a9f7f2f7d266","Type":"ContainerStarted","Data":"a8937c6a68ba50e26516a9be7b875da8b516998ebb894db7b240c55ed014b961"} Mar 08 03:18:58.368128 master-0 kubenswrapper[13046]: I0308 03:18:58.367868 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:18:59.118438 master-0 kubenswrapper[13046]: I0308 03:18:59.118309 13046 scope.go:117] "RemoveContainer" containerID="db345230f982b5275a648807d0d38fa9bd427abbd6589da846240972998d2824" Mar 08 03:18:59.118789 master-0 kubenswrapper[13046]: I0308 03:18:59.118615 13046 scope.go:117] "RemoveContainer" containerID="408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814" Mar 08 03:18:59.118950 master-0 kubenswrapper[13046]: E0308 03:18:59.118910 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6" Mar 08 03:18:59.364644 master-0 kubenswrapper[13046]: I0308 03:18:59.364601 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-gdwg9_53254b19-b5b3-4f97-bc64-37be8b2a41b7/manager/2.log" Mar 08 03:18:59.365136 master-0 kubenswrapper[13046]: I0308 03:18:59.364977 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" 
event={"ID":"53254b19-b5b3-4f97-bc64-37be8b2a41b7","Type":"ContainerStarted","Data":"dd0ed2b60ff5b7045d38255f5e140608e45bddc31667779f1f22fe7444e2a5c9"} Mar 08 03:18:59.365209 master-0 kubenswrapper[13046]: I0308 03:18:59.365178 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:18:59.366642 master-0 kubenswrapper[13046]: I0308 03:18:59.366614 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-62spv_8e1af4e8-2ade-48b3-8c56-0ab78f77fac9/manager/2.log" Mar 08 03:18:59.367346 master-0 kubenswrapper[13046]: I0308 03:18:59.367293 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" event={"ID":"8e1af4e8-2ade-48b3-8c56-0ab78f77fac9","Type":"ContainerStarted","Data":"f8a6a6125777f487ac5971dda4546e378b5bc880b54814ebcae4be385f798b61"} Mar 08 03:18:59.368050 master-0 kubenswrapper[13046]: I0308 03:18:59.367994 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:19:00.119218 master-0 kubenswrapper[13046]: I0308 03:19:00.119153 13046 scope.go:117] "RemoveContainer" containerID="db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4" Mar 08 03:19:00.380580 master-0 kubenswrapper[13046]: I0308 03:19:00.380426 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-64bf9778cb-7hsbf_eedc7538-9cc6-4bf5-9628-e278310d796b/marketplace-operator/2.log" Mar 08 03:19:00.380580 master-0 kubenswrapper[13046]: I0308 03:19:00.380530 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" 
event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerStarted","Data":"860b9f9ba562c928fcdd286d234e1fea2b89e768618478368682f81163d4acf5"} Mar 08 03:19:01.293978 master-0 kubenswrapper[13046]: I0308 03:19:01.293879 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-zg4zr" Mar 08 03:19:03.119007 master-0 kubenswrapper[13046]: I0308 03:19:03.118936 13046 scope.go:117] "RemoveContainer" containerID="6793e29fa45e989dedfbb6168fd8df0d147bdd307f2f1901888c8cee650797a6" Mar 08 03:19:03.119563 master-0 kubenswrapper[13046]: E0308 03:19:03.119268 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-zd6kq_openshift-insights(b33ed2de-435b-4ccc-8dfd-29d52bf95ea8)\"" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" Mar 08 03:19:03.674558 master-0 kubenswrapper[13046]: I0308 03:19:03.674450 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:19:03.680527 master-0 kubenswrapper[13046]: I0308 03:19:03.680451 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" Mar 08 03:19:04.294806 master-0 kubenswrapper[13046]: I0308 03:19:04.294722 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-62spv" Mar 08 03:19:04.595972 master-0 kubenswrapper[13046]: I0308 03:19:04.595901 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-gdwg9" Mar 08 03:19:05.119121 master-0 kubenswrapper[13046]: 
I0308 03:19:05.118834 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:19:05.121287 master-0 kubenswrapper[13046]: E0308 03:19:05.121206 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: I0308 03:19:05.877731 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: E0308 03:19:05.878043 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: I0308 03:19:05.878066 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: E0308 03:19:05.878084 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0781e6af-f5b5-40f7-bb7f-5bc6978b4957" containerName="installer" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: I0308 03:19:05.878096 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="0781e6af-f5b5-40f7-bb7f-5bc6978b4957" containerName="installer" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: E0308 03:19:05.878146 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: I0308 03:19:05.878159 13046 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: E0308 03:19:05.878183 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:19:05.878171 master-0 kubenswrapper[13046]: I0308 03:19:05.878195 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878226 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc664e3-7f37-4fba-8104-544ffb18c1bd" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878238 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc664e3-7f37-4fba-8104-544ffb18c1bd" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878266 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878278 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878300 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e324f6c-ee4c-42bc-b241-9c6938749854" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878313 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e324f6c-ee4c-42bc-b241-9c6938749854" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878353 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb74744-fb99-4663-a7d0-7bae2db205e9" containerName="installer" Mar 08 03:19:05.884791 
master-0 kubenswrapper[13046]: I0308 03:19:05.878366 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb74744-fb99-4663-a7d0-7bae2db205e9" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878393 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea81472-8a81-45ec-a07d-8710f47a927d" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878406 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea81472-8a81-45ec-a07d-8710f47a927d" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878423 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878537 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878559 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6e3f01-0f22-4961-b450-56aca5477943" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878571 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6e3f01-0f22-4961-b450-56aca5477943" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878598 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234638fe-5577-45bc-9094-907c5611da38" containerName="kube-rbac-proxy" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878610 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="234638fe-5577-45bc-9094-907c5611da38" containerName="kube-rbac-proxy" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878625 13046 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878638 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878661 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b05d5093-20f4-42d5-9db3-811e049cc1b6" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878674 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b05d5093-20f4-42d5-9db3-811e049cc1b6" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: E0308 03:19:05.878704 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81d3c37-e8d7-44c8-973e-13992380ce85" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.878716 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81d3c37-e8d7-44c8-973e-13992380ce85" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879446 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bbf59e4-f202-4f7c-9f45-0d07de8e6447" containerName="prober" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879465 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879508 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879545 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 08 
03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879564 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879579 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6e3f01-0f22-4961-b450-56aca5477943" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879597 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b05d5093-20f4-42d5-9db3-811e049cc1b6" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879610 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea81472-8a81-45ec-a07d-8710f47a927d" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879625 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="0781e6af-f5b5-40f7-bb7f-5bc6978b4957" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879652 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81d3c37-e8d7-44c8-973e-13992380ce85" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879674 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc664e3-7f37-4fba-8104-544ffb18c1bd" containerName="installer" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879687 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e324f6c-ee4c-42bc-b241-9c6938749854" containerName="extract-utilities" Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879708 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5" containerName="assisted-installer-controller" Mar 08 03:19:05.884791 master-0 
kubenswrapper[13046]: I0308 03:19:05.879721 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb74744-fb99-4663-a7d0-7bae2db205e9" containerName="installer"
Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.879740 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="234638fe-5577-45bc-9094-907c5611da38" containerName="kube-rbac-proxy"
Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.880303 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.883379 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-68q4j"
Mar 08 03:19:05.884791 master-0 kubenswrapper[13046]: I0308 03:19:05.884024 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt"
Mar 08 03:19:05.911420 master-0 kubenswrapper[13046]: I0308 03:19:05.910289 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"]
Mar 08 03:19:06.074513 master-0 kubenswrapper[13046]: I0308 03:19:06.074418 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.074792 master-0 kubenswrapper[13046]: I0308 03:19:06.074548 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.074965 master-0 kubenswrapper[13046]: I0308 03:19:06.074884 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-var-lock\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.176451 master-0 kubenswrapper[13046]: I0308 03:19:06.176300 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-var-lock\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.176733 master-0 kubenswrapper[13046]: I0308 03:19:06.176533 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-var-lock\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.176733 master-0 kubenswrapper[13046]: I0308 03:19:06.176601 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.176878 master-0 kubenswrapper[13046]: I0308 03:19:06.176737 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.176878 master-0 kubenswrapper[13046]: I0308 03:19:06.176793 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.205437 master-0 kubenswrapper[13046]: I0308 03:19:06.205361 13046 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 08 03:19:06.210456 master-0 kubenswrapper[13046]: I0308 03:19:06.210396 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.244367 master-0 kubenswrapper[13046]: I0308 03:19:06.244251 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 08 03:19:06.437302 master-0 kubenswrapper[13046]: I0308 03:19:06.437230 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/1.log"
Mar 08 03:19:06.439813 master-0 kubenswrapper[13046]: I0308 03:19:06.439205 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/0.log"
Mar 08 03:19:06.439813 master-0 kubenswrapper[13046]: I0308 03:19:06.439299 13046 generic.go:334] "Generic (PLEG): container finished" podID="fd6b827c-70b0-47ed-b07c-c696343248a8" containerID="3eb78859836f3da919c6f295f9ebd383a5b6b693cee1d6fd99889820e2d9696c" exitCode=1
Mar 08 03:19:06.439813 master-0 kubenswrapper[13046]: I0308 03:19:06.439476 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" event={"ID":"fd6b827c-70b0-47ed-b07c-c696343248a8","Type":"ContainerDied","Data":"3eb78859836f3da919c6f295f9ebd383a5b6b693cee1d6fd99889820e2d9696c"}
Mar 08 03:19:06.439813 master-0 kubenswrapper[13046]: I0308 03:19:06.439573 13046 scope.go:117] "RemoveContainer" containerID="927e976b2419f80e2b156dd6620627f0ab5b15535fdab986491afec086084730"
Mar 08 03:19:06.440883 master-0 kubenswrapper[13046]: I0308 03:19:06.440416 13046 scope.go:117] "RemoveContainer" containerID="3eb78859836f3da919c6f295f9ebd383a5b6b693cee1d6fd99889820e2d9696c"
Mar 08 03:19:06.441169 master-0 kubenswrapper[13046]: E0308 03:19:06.440894 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ingress-operator pod=ingress-operator-677db989d6-r9m2k_openshift-ingress-operator(fd6b827c-70b0-47ed-b07c-c696343248a8)\"" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" podUID="fd6b827c-70b0-47ed-b07c-c696343248a8"
Mar 08 03:19:06.448152 master-0 kubenswrapper[13046]: I0308 03:19:06.448032 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-nnd8x_bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/cluster-node-tuning-operator/1.log"
Mar 08 03:19:06.449630 master-0 kubenswrapper[13046]: I0308 03:19:06.449529 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-nnd8x_bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/cluster-node-tuning-operator/0.log"
Mar 08 03:19:06.449630 master-0 kubenswrapper[13046]: I0308 03:19:06.449602 13046 generic.go:334] "Generic (PLEG): container finished" podID="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c" containerID="82344cdd1de2ca9d97e9556224c6c294e5e614051e31021d5bf08c8ea15a927c" exitCode=1
Mar 08 03:19:06.450171 master-0 kubenswrapper[13046]: I0308 03:19:06.449751 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" event={"ID":"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c","Type":"ContainerDied","Data":"82344cdd1de2ca9d97e9556224c6c294e5e614051e31021d5bf08c8ea15a927c"}
Mar 08 03:19:06.450437 master-0 kubenswrapper[13046]: I0308 03:19:06.450366 13046 scope.go:117] "RemoveContainer" containerID="82344cdd1de2ca9d97e9556224c6c294e5e614051e31021d5bf08c8ea15a927c"
Mar 08 03:19:06.450777 master-0 kubenswrapper[13046]: E0308 03:19:06.450691 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-node-tuning-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-node-tuning-operator pod=cluster-node-tuning-operator-66c7586884-nnd8x_openshift-cluster-node-tuning-operator(bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c)\"" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" podUID="bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c"
Mar 08 03:19:06.452557 master-0 kubenswrapper[13046]: I0308 03:19:06.452477 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/1.log"
Mar 08 03:19:06.456151 master-0 kubenswrapper[13046]: I0308 03:19:06.455959 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/0.log"
Mar 08 03:19:06.457064 master-0 kubenswrapper[13046]: I0308 03:19:06.456610 13046 generic.go:334] "Generic (PLEG): container finished" podID="5a2c9576-f7bd-4ac5-a7fe-530f26642f97" containerID="79f755e5e79953802136847a8f8f2fbf40c60f7a80b082c23d87184dffad232b" exitCode=255
Mar 08 03:19:06.457064 master-0 kubenswrapper[13046]: I0308 03:19:06.456690 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" event={"ID":"5a2c9576-f7bd-4ac5-a7fe-530f26642f97","Type":"ContainerDied","Data":"79f755e5e79953802136847a8f8f2fbf40c60f7a80b082c23d87184dffad232b"}
Mar 08 03:19:06.458220 master-0 kubenswrapper[13046]: I0308 03:19:06.458145 13046 scope.go:117] "RemoveContainer" containerID="79f755e5e79953802136847a8f8f2fbf40c60f7a80b082c23d87184dffad232b"
Mar 08 03:19:06.458676 master-0 kubenswrapper[13046]: E0308 03:19:06.458615 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"control-plane-machine-set-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=control-plane-machine-set-operator pod=control-plane-machine-set-operator-6686554ddc-gwnnd_openshift-machine-api(5a2c9576-f7bd-4ac5-a7fe-530f26642f97)\"" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" podUID="5a2c9576-f7bd-4ac5-a7fe-530f26642f97"
Mar 08 03:19:06.474028 master-0 kubenswrapper[13046]: I0308 03:19:06.473896 13046 scope.go:117] "RemoveContainer" containerID="8a3da09cabdcb126428fcd447defcc99973fd5db3565d3792f66591da1ac8333"
Mar 08 03:19:06.510234 master-0 kubenswrapper[13046]: I0308 03:19:06.510150 13046 scope.go:117] "RemoveContainer" containerID="27ab0f00e980c7d4d9fcf7e8c62f276ea49b975eb80fef82536adf6bfc74a796"
Mar 08 03:19:06.748389 master-0 kubenswrapper[13046]: I0308 03:19:06.748254 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"]
Mar 08 03:19:06.762772 master-0 kubenswrapper[13046]: W0308 03:19:06.762698 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddbe1bc10_8da1_48fc_a9f0_089154ab30e3.slice/crio-34f6c32ea7d6dc42aa98a275d3fd61a1e2b51b3169d9091f7ead891182d0165d WatchSource:0}: Error finding container 34f6c32ea7d6dc42aa98a275d3fd61a1e2b51b3169d9091f7ead891182d0165d: Status 404 returned error can't find the container with id 34f6c32ea7d6dc42aa98a275d3fd61a1e2b51b3169d9091f7ead891182d0165d
Mar 08 03:19:07.465906 master-0 kubenswrapper[13046]: I0308 03:19:07.465845 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/1.log"
Mar 08 03:19:07.469233 master-0 kubenswrapper[13046]: I0308 03:19:07.469076 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-nnd8x_bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/cluster-node-tuning-operator/1.log"
Mar 08 03:19:07.471962 master-0 kubenswrapper[13046]: I0308 03:19:07.471816 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/1.log"
Mar 08 03:19:07.474419 master-0 kubenswrapper[13046]: I0308 03:19:07.474338 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"dbe1bc10-8da1-48fc-a9f0-089154ab30e3","Type":"ContainerStarted","Data":"652710c8bb185d833dd87daea2a7216e30c94a5c9fbe64033ef7d2bc88767b50"}
Mar 08 03:19:07.474419 master-0 kubenswrapper[13046]: I0308 03:19:07.474387 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"dbe1bc10-8da1-48fc-a9f0-089154ab30e3","Type":"ContainerStarted","Data":"34f6c32ea7d6dc42aa98a275d3fd61a1e2b51b3169d9091f7ead891182d0165d"}
Mar 08 03:19:07.495067 master-0 kubenswrapper[13046]: I0308 03:19:07.494955 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.494927922 podStartE2EDuration="2.494927922s" podCreationTimestamp="2026-03-08 03:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:19:07.491167887 +0000 UTC m=+349.569935134" watchObservedRunningTime="2026-03-08 03:19:07.494927922 +0000 UTC m=+349.573695169"
Mar 08 03:19:08.800687 master-0 kubenswrapper[13046]: I0308 03:19:08.800609 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"]
Mar 08 03:19:08.801779 master-0 kubenswrapper[13046]: I0308 03:19:08.801733 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:08.803897 master-0 kubenswrapper[13046]: I0308 03:19:08.803843 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ks2rl"
Mar 08 03:19:08.804392 master-0 kubenswrapper[13046]: I0308 03:19:08.804346 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 08 03:19:08.825057 master-0 kubenswrapper[13046]: I0308 03:19:08.824984 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"]
Mar 08 03:19:08.916796 master-0 kubenswrapper[13046]: I0308 03:19:08.916734 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:08.916796 master-0 kubenswrapper[13046]: I0308 03:19:08.916802 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e141bdb-2d04-481b-8614-cc0d57ebc317-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:08.917577 master-0 kubenswrapper[13046]: I0308 03:19:08.916861 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.019152 master-0 kubenswrapper[13046]: I0308 03:19:09.019034 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.019152 master-0 kubenswrapper[13046]: I0308 03:19:09.019143 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e141bdb-2d04-481b-8614-cc0d57ebc317-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.019427 master-0 kubenswrapper[13046]: I0308 03:19:09.019181 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.019427 master-0 kubenswrapper[13046]: I0308 03:19:09.019202 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.019427 master-0 kubenswrapper[13046]: I0308 03:19:09.019274 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.070933 master-0 kubenswrapper[13046]: I0308 03:19:09.070730 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e141bdb-2d04-481b-8614-cc0d57ebc317-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.140133 master-0 kubenswrapper[13046]: I0308 03:19:09.140062 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 08 03:19:09.614915 master-0 kubenswrapper[13046]: I0308 03:19:09.614829 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"]
Mar 08 03:19:09.623502 master-0 kubenswrapper[13046]: W0308 03:19:09.623393 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6e141bdb_2d04_481b_8614_cc0d57ebc317.slice/crio-ca7691e97a6b25562e97d1d8410acbd8b78f4439eb613fa35dbeefa1ee81d093 WatchSource:0}: Error finding container ca7691e97a6b25562e97d1d8410acbd8b78f4439eb613fa35dbeefa1ee81d093: Status 404 returned error can't find the container with id ca7691e97a6b25562e97d1d8410acbd8b78f4439eb613fa35dbeefa1ee81d093
Mar 08 03:19:10.031222 master-0 kubenswrapper[13046]: I0308 03:19:10.031124 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"]
Mar 08 03:19:10.032570 master-0 kubenswrapper[13046]: I0308 03:19:10.032521 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.036420 master-0 kubenswrapper[13046]: I0308 03:19:10.036324 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-gsmhw"
Mar 08 03:19:10.037046 master-0 kubenswrapper[13046]: I0308 03:19:10.036973 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 08 03:19:10.050785 master-0 kubenswrapper[13046]: I0308 03:19:10.050711 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"]
Mar 08 03:19:10.146393 master-0 kubenswrapper[13046]: I0308 03:19:10.146201 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.146393 master-0 kubenswrapper[13046]: I0308 03:19:10.146273 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.146393 master-0 kubenswrapper[13046]: I0308 03:19:10.146311 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.249229 master-0 kubenswrapper[13046]: I0308 03:19:10.249146 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.249229 master-0 kubenswrapper[13046]: I0308 03:19:10.249234 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.249620 master-0 kubenswrapper[13046]: I0308 03:19:10.249560 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.249742 master-0 kubenswrapper[13046]: I0308 03:19:10.249670 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.249834 master-0 kubenswrapper[13046]: I0308 03:19:10.249772 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.278611 master-0 kubenswrapper[13046]: I0308 03:19:10.278533 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.382811 master-0 kubenswrapper[13046]: I0308 03:19:10.382733 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 08 03:19:10.510868 master-0 kubenswrapper[13046]: I0308 03:19:10.506137 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6e141bdb-2d04-481b-8614-cc0d57ebc317","Type":"ContainerStarted","Data":"31d362a1e45d1ce8106f91d9ec48b041bf6440fcfe5b5eead08ffe0ccf96e9e3"}
Mar 08 03:19:10.510868 master-0 kubenswrapper[13046]: I0308 03:19:10.506221 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6e141bdb-2d04-481b-8614-cc0d57ebc317","Type":"ContainerStarted","Data":"ca7691e97a6b25562e97d1d8410acbd8b78f4439eb613fa35dbeefa1ee81d093"}
Mar 08 03:19:10.549195 master-0 kubenswrapper[13046]: I0308 03:19:10.549082 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" podStartSLOduration=2.549056669 podStartE2EDuration="2.549056669s" podCreationTimestamp="2026-03-08 03:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:19:10.544988448 +0000 UTC m=+352.623755695" watchObservedRunningTime="2026-03-08 03:19:10.549056669 +0000 UTC m=+352.627823926"
Mar 08 03:19:10.891565 master-0 kubenswrapper[13046]: I0308 03:19:10.891470 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"]
Mar 08 03:19:10.902619 master-0 kubenswrapper[13046]: W0308 03:19:10.902575 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podae3dcaba_5e1d_42b0_b34b_e3789030f973.slice/crio-be3ff9a73b8b66d42c6b361633b72cbd7e4886918cde630551a752ea9f0d8ef7 WatchSource:0}: Error finding container be3ff9a73b8b66d42c6b361633b72cbd7e4886918cde630551a752ea9f0d8ef7: Status 404 returned error can't find the container with id be3ff9a73b8b66d42c6b361633b72cbd7e4886918cde630551a752ea9f0d8ef7
Mar 08 03:19:11.118327 master-0 kubenswrapper[13046]: I0308 03:19:11.118248 13046 scope.go:117] "RemoveContainer" containerID="523a4995d1091d85f6f4c5b24c00c4c59b3b9c15a7de5526ab7cdec00a907b84"
Mar 08 03:19:11.119242 master-0 kubenswrapper[13046]: E0308 03:19:11.118510 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-gwv4q_openshift-machine-api(5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" podUID="5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b"
Mar 08 03:19:11.520740 master-0 kubenswrapper[13046]: I0308 03:19:11.520013 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"ae3dcaba-5e1d-42b0-b34b-e3789030f973","Type":"ContainerStarted","Data":"41448e83cf6bc50b2fdf0224b92a970ff2490922cefb26b542a49d95b1099653"}
Mar 08 03:19:11.520740 master-0 kubenswrapper[13046]: I0308 03:19:11.520094 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"ae3dcaba-5e1d-42b0-b34b-e3789030f973","Type":"ContainerStarted","Data":"be3ff9a73b8b66d42c6b361633b72cbd7e4886918cde630551a752ea9f0d8ef7"}
Mar 08 03:19:11.551713 master-0 kubenswrapper[13046]: I0308 03:19:11.551610 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podStartSLOduration=1.5515823439999998 podStartE2EDuration="1.551582344s" podCreationTimestamp="2026-03-08 03:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:19:11.542614613 +0000 UTC m=+353.621381860" watchObservedRunningTime="2026-03-08 03:19:11.551582344 +0000 UTC m=+353.630349591"
Mar 08 03:19:14.120716 master-0 kubenswrapper[13046]: I0308 03:19:14.120645 13046 scope.go:117] "RemoveContainer" containerID="408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814"
Mar 08 03:19:14.121350 master-0 kubenswrapper[13046]: E0308 03:19:14.121020 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"package-server-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=package-server-manager pod=package-server-manager-854648ff6d-2gxdj_openshift-operator-lifecycle-manager(2bbe9b81-0efb-4caa-bacd-55348cd392c6)\"" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" podUID="2bbe9b81-0efb-4caa-bacd-55348cd392c6"
Mar 08 03:19:14.123423 master-0 kubenswrapper[13046]: I0308 03:19:14.123368 13046 scope.go:117] "RemoveContainer" containerID="6793e29fa45e989dedfbb6168fd8df0d147bdd307f2f1901888c8cee650797a6"
Mar 08 03:19:14.123823 master-0 kubenswrapper[13046]: E0308 03:19:14.123763 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-zd6kq_openshift-insights(b33ed2de-435b-4ccc-8dfd-29d52bf95ea8)\"" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8"
Mar 08 03:19:17.125173 master-0 kubenswrapper[13046]: I0308 03:19:17.125095 13046 scope.go:117] "RemoveContainer" containerID="3eb78859836f3da919c6f295f9ebd383a5b6b693cee1d6fd99889820e2d9696c"
Mar 08 03:19:17.573894 master-0 kubenswrapper[13046]: I0308 03:19:17.573742 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/1.log"
Mar 08 03:19:17.574367 master-0 kubenswrapper[13046]: I0308 03:19:17.574261 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-r9m2k" event={"ID":"fd6b827c-70b0-47ed-b07c-c696343248a8","Type":"ContainerStarted","Data":"0f8501f117b038f8247a38aa45f9549defb832d0c820881f7e93a086a1ccabe3"}
Mar 08 03:19:18.123663 master-0 kubenswrapper[13046]: I0308 03:19:18.123614 13046 scope.go:117] "RemoveContainer" containerID="82344cdd1de2ca9d97e9556224c6c294e5e614051e31021d5bf08c8ea15a927c"
Mar 08 03:19:18.273600 master-0 kubenswrapper[13046]: I0308 03:19:18.273533 13046 scope.go:117] "RemoveContainer" containerID="d124b43f51c653f69e8c6d5ef2246cec81a9eefc70ec4a601600ce3b071a918e"
Mar 08 03:19:18.594023 master-0 kubenswrapper[13046]: I0308 03:19:18.593944 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-nnd8x_bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c/cluster-node-tuning-operator/1.log"
Mar 08 03:19:18.594256 master-0 kubenswrapper[13046]: I0308 03:19:18.594055 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-nnd8x" event={"ID":"bd6d3abf-d2df-4a6c-b7ab-40b78948ad0c","Type":"ContainerStarted","Data":"cd07f8c0e732a472564625a496b9e9689fb6a3ba2cb8a235d6afc298990d3024"}
Mar 08 03:19:20.120872 master-0 kubenswrapper[13046]: I0308 03:19:20.120781 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86"
Mar 08 03:19:20.614820 master-0 kubenswrapper[13046]: I0308 03:19:20.614674 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7"}
Mar 08 03:19:22.119145 master-0 kubenswrapper[13046]: I0308 03:19:22.119076 13046 scope.go:117] "RemoveContainer" containerID="79f755e5e79953802136847a8f8f2fbf40c60f7a80b082c23d87184dffad232b"
Mar 08 03:19:22.637285 master-0 kubenswrapper[13046]: I0308 03:19:22.637198 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/1.log"
Mar 08 03:19:22.637616 master-0 kubenswrapper[13046]: I0308 03:19:22.637301 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-gwnnd" event={"ID":"5a2c9576-f7bd-4ac5-a7fe-530f26642f97","Type":"ContainerStarted","Data":"dd2c2a5ccf46962841169ba465c1ec1a1afc1826ee53edab62249560bc02768b"}
Mar 08 03:19:25.119351 master-0 kubenswrapper[13046]: I0308 03:19:25.119272 13046 scope.go:117] "RemoveContainer" containerID="408186a944f877462dc4c43e8f4761cc5e6bf1c86dc26f1a76d6146390289814"
Mar 08 03:19:25.120195 master-0 kubenswrapper[13046]: I0308 03:19:25.119383 13046 scope.go:117] "RemoveContainer" containerID="523a4995d1091d85f6f4c5b24c00c4c59b3b9c15a7de5526ab7cdec00a907b84"
Mar 08 03:19:25.661992 master-0 kubenswrapper[13046]: I0308 03:19:25.661846 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-2gxdj_2bbe9b81-0efb-4caa-bacd-55348cd392c6/package-server-manager/3.log"
Mar 08 03:19:25.662351 master-0 kubenswrapper[13046]: I0308 03:19:25.662287 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" event={"ID":"2bbe9b81-0efb-4caa-bacd-55348cd392c6","Type":"ContainerStarted","Data":"96215efadff949bed63c972d40993111447eb8973692eae145fcd3d9d552d36b"}
Mar 08 03:19:25.662763 master-0 kubenswrapper[13046]: I0308 03:19:25.662721 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj"
Mar 08 03:19:25.664969 master-0 kubenswrapper[13046]: I0308 03:19:25.664924 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/3.log"
Mar 08 03:19:25.665570 master-0 kubenswrapper[13046]: I0308 03:19:25.665505 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-gwv4q" event={"ID":"5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b","Type":"ContainerStarted","Data":"c68089ea079bd87049b72adfe75150681e9142f7d1e5f05d5a8834e38f53e3b2"}
Mar 08 03:19:27.118924 master-0 kubenswrapper[13046]: I0308 03:19:27.118841 13046 scope.go:117] "RemoveContainer" containerID="6793e29fa45e989dedfbb6168fd8df0d147bdd307f2f1901888c8cee650797a6"
Mar 08 03:19:27.119784 master-0 kubenswrapper[13046]: E0308 03:19:27.119173 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-zd6kq_openshift-insights(b33ed2de-435b-4ccc-8dfd-29d52bf95ea8)\"" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8"
Mar 08 03:19:27.732940 master-0 kubenswrapper[13046]: I0308 03:19:27.732854 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 08 03:19:27.734164 master-0 kubenswrapper[13046]: I0308 03:19:27.734113 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 03:19:27.737720 master-0 kubenswrapper[13046]: I0308 03:19:27.737656 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-gcdsk"
Mar 08 03:19:27.737783 master-0 kubenswrapper[13046]: I0308 03:19:27.737699 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 08 03:19:27.749010 master-0 kubenswrapper[13046]: I0308 03:19:27.748905 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 08 03:19:27.851469 master-0 kubenswrapper[13046]: I0308 03:19:27.851347 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 03:19:27.851469 master-0 kubenswrapper[13046]: I0308 03:19:27.851457 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 03:19:27.851832 master-0 kubenswrapper[13046]: I0308 03:19:27.851659 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-var-lock\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 03:19:27.953317 master-0 kubenswrapper[13046]: I0308 03:19:27.953227 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-var-lock\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 03:19:27.953638 master-0 kubenswrapper[13046]: I0308 03:19:27.953361 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-var-lock\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 03:19:27.953638 master-0 kubenswrapper[13046]: I0308 03:19:27.953396 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 08 03:19:27.953638 master-0 kubenswrapper[13046]: I0308 03:19:27.953468 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") "
pod="openshift-kube-apiserver/installer-2-master-0" Mar 08 03:19:27.953831 master-0 kubenswrapper[13046]: I0308 03:19:27.953624 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 08 03:19:27.984970 master-0 kubenswrapper[13046]: I0308 03:19:27.984838 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 08 03:19:28.096430 master-0 kubenswrapper[13046]: I0308 03:19:28.096319 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 08 03:19:28.608435 master-0 kubenswrapper[13046]: I0308 03:19:28.608358 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 08 03:19:28.614070 master-0 kubenswrapper[13046]: W0308 03:19:28.613997 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1e34eb33_c8a4_4207_b78c_53e0cfc09e91.slice/crio-1b05f5f5d5046101c0a6daa5ca65ac0027c737756a9e67f8f82bc73503b0e70d WatchSource:0}: Error finding container 1b05f5f5d5046101c0a6daa5ca65ac0027c737756a9e67f8f82bc73503b0e70d: Status 404 returned error can't find the container with id 1b05f5f5d5046101c0a6daa5ca65ac0027c737756a9e67f8f82bc73503b0e70d Mar 08 03:19:28.695253 master-0 kubenswrapper[13046]: I0308 03:19:28.695196 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" 
event={"ID":"1e34eb33-c8a4-4207-b78c-53e0cfc09e91","Type":"ContainerStarted","Data":"1b05f5f5d5046101c0a6daa5ca65ac0027c737756a9e67f8f82bc73503b0e70d"} Mar 08 03:19:29.178214 master-0 kubenswrapper[13046]: I0308 03:19:29.178120 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:19:29.183692 master-0 kubenswrapper[13046]: I0308 03:19:29.183645 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:19:29.705795 master-0 kubenswrapper[13046]: I0308 03:19:29.705704 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"1e34eb33-c8a4-4207-b78c-53e0cfc09e91","Type":"ContainerStarted","Data":"81690b2979176695e8530d71a981c729b1a1352c3403e63acd375ec96d2ffadd"} Mar 08 03:19:29.706717 master-0 kubenswrapper[13046]: I0308 03:19:29.705925 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:19:29.730312 master-0 kubenswrapper[13046]: I0308 03:19:29.730190 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.730157744 podStartE2EDuration="2.730157744s" podCreationTimestamp="2026-03-08 03:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:19:29.728296422 +0000 UTC m=+371.807063669" watchObservedRunningTime="2026-03-08 03:19:29.730157744 +0000 UTC m=+371.808925001" Mar 08 03:19:30.498163 master-0 kubenswrapper[13046]: I0308 03:19:30.498055 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:19:38.545785 master-0 kubenswrapper[13046]: I0308 03:19:38.545694 
13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 03:19:38.546513 master-0 kubenswrapper[13046]: I0308 03:19:38.545962 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" containerID="cri-o://a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044" gracePeriod=30 Mar 08 03:19:38.546513 master-0 kubenswrapper[13046]: I0308 03:19:38.546084 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" containerID="cri-o://d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1" gracePeriod=30 Mar 08 03:19:38.549399 master-0 kubenswrapper[13046]: I0308 03:19:38.549355 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 08 03:19:38.549684 master-0 kubenswrapper[13046]: E0308 03:19:38.549653 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 03:19:38.549684 master-0 kubenswrapper[13046]: I0308 03:19:38.549674 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 03:19:38.549815 master-0 kubenswrapper[13046]: E0308 03:19:38.549695 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 03:19:38.549815 master-0 kubenswrapper[13046]: I0308 03:19:38.549703 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 03:19:38.549940 master-0 kubenswrapper[13046]: I0308 03:19:38.549832 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 08 
03:19:38.549940 master-0 kubenswrapper[13046]: I0308 03:19:38.549852 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 08 03:19:38.552010 master-0 kubenswrapper[13046]: I0308 03:19:38.551979 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.706750 master-0 kubenswrapper[13046]: I0308 03:19:38.706692 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.707114 master-0 kubenswrapper[13046]: I0308 03:19:38.707086 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.707297 master-0 kubenswrapper[13046]: I0308 03:19:38.707271 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.707465 master-0 kubenswrapper[13046]: I0308 03:19:38.707438 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.707681 master-0 kubenswrapper[13046]: I0308 03:19:38.707655 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.707869 master-0 kubenswrapper[13046]: I0308 03:19:38.707843 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.809258 master-0 kubenswrapper[13046]: I0308 03:19:38.809096 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.809258 master-0 kubenswrapper[13046]: I0308 03:19:38.809171 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.809258 master-0 kubenswrapper[13046]: I0308 03:19:38.809210 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.809680 master-0 kubenswrapper[13046]: I0308 03:19:38.809315 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.809680 master-0 kubenswrapper[13046]: I0308 03:19:38.809533 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.809888 master-0 kubenswrapper[13046]: I0308 03:19:38.809835 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.809978 master-0 kubenswrapper[13046]: I0308 03:19:38.809927 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.810244 master-0 kubenswrapper[13046]: I0308 03:19:38.810200 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.810574 master-0 kubenswrapper[13046]: I0308 03:19:38.810330 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 
03:19:38.810907 master-0 kubenswrapper[13046]: I0308 03:19:38.810871 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.811152 master-0 kubenswrapper[13046]: I0308 03:19:38.810946 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:38.811301 master-0 kubenswrapper[13046]: I0308 03:19:38.810953 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 08 03:19:42.119136 master-0 kubenswrapper[13046]: I0308 03:19:42.119025 13046 scope.go:117] "RemoveContainer" containerID="6793e29fa45e989dedfbb6168fd8df0d147bdd307f2f1901888c8cee650797a6" Mar 08 03:19:42.810323 master-0 kubenswrapper[13046]: I0308 03:19:42.810274 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" event={"ID":"b33ed2de-435b-4ccc-8dfd-29d52bf95ea8","Type":"ContainerStarted","Data":"4cf7dbf7bba2bcbbf2ff91d9bc5bdbe6c23e8aa8ef607256223b43f717015548"} Mar 08 03:19:51.591428 master-0 kubenswrapper[13046]: E0308 03:19:51.591355 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 03:19:51.592278 master-0 kubenswrapper[13046]: I0308 03:19:51.592007 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 08 03:19:51.615569 master-0 kubenswrapper[13046]: W0308 03:19:51.615506 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c709c82970b529e7b9b895aa92ef05.slice/crio-55670e64e6791bdf67e80049bebff4234f21f7e2789885894b29831d5dc075f6 WatchSource:0}: Error finding container 55670e64e6791bdf67e80049bebff4234f21f7e2789885894b29831d5dc075f6: Status 404 returned error can't find the container with id 55670e64e6791bdf67e80049bebff4234f21f7e2789885894b29831d5dc075f6 Mar 08 03:19:51.882037 master-0 kubenswrapper[13046]: I0308 03:19:51.881968 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"55670e64e6791bdf67e80049bebff4234f21f7e2789885894b29831d5dc075f6"} Mar 08 03:19:52.889654 master-0 kubenswrapper[13046]: I0308 03:19:52.889554 13046 generic.go:334] "Generic (PLEG): container finished" podID="dbe1bc10-8da1-48fc-a9f0-089154ab30e3" containerID="652710c8bb185d833dd87daea2a7216e30c94a5c9fbe64033ef7d2bc88767b50" exitCode=0 Mar 08 03:19:52.890643 master-0 kubenswrapper[13046]: I0308 03:19:52.889672 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"dbe1bc10-8da1-48fc-a9f0-089154ab30e3","Type":"ContainerDied","Data":"652710c8bb185d833dd87daea2a7216e30c94a5c9fbe64033ef7d2bc88767b50"} Mar 08 03:19:52.892833 master-0 kubenswrapper[13046]: I0308 03:19:52.892747 13046 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="9edf4a5cc547abe2fed4e9b92ac95fae0ce4ebe15ffb45f1ca8e654c801e30f4" exitCode=0 Mar 08 03:19:52.892833 master-0 kubenswrapper[13046]: I0308 03:19:52.892795 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"9edf4a5cc547abe2fed4e9b92ac95fae0ce4ebe15ffb45f1ca8e654c801e30f4"} Mar 08 03:19:53.904910 master-0 kubenswrapper[13046]: I0308 03:19:53.904807 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" exitCode=1 Mar 08 03:19:53.905797 master-0 kubenswrapper[13046]: I0308 03:19:53.904928 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7"} Mar 08 03:19:53.905797 master-0 kubenswrapper[13046]: I0308 03:19:53.905012 13046 scope.go:117] "RemoveContainer" containerID="9d5aebae5e452cd448361887c46f5d4e88aec8d69e154b3ec2c88206a321dd86" Mar 08 03:19:53.906693 master-0 kubenswrapper[13046]: I0308 03:19:53.905845 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:19:53.906693 master-0 kubenswrapper[13046]: E0308 03:19:53.906373 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:19:54.210960 master-0 kubenswrapper[13046]: I0308 03:19:54.210912 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 03:19:54.313672 master-0 kubenswrapper[13046]: I0308 03:19:54.313601 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kube-api-access\") pod \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " Mar 08 03:19:54.313844 master-0 kubenswrapper[13046]: I0308 03:19:54.313800 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kubelet-dir\") pod \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " Mar 08 03:19:54.314011 master-0 kubenswrapper[13046]: I0308 03:19:54.313948 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dbe1bc10-8da1-48fc-a9f0-089154ab30e3" (UID: "dbe1bc10-8da1-48fc-a9f0-089154ab30e3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:19:54.314069 master-0 kubenswrapper[13046]: I0308 03:19:54.313979 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-var-lock\") pod \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\" (UID: \"dbe1bc10-8da1-48fc-a9f0-089154ab30e3\") " Mar 08 03:19:54.314069 master-0 kubenswrapper[13046]: I0308 03:19:54.314037 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-var-lock" (OuterVolumeSpecName: "var-lock") pod "dbe1bc10-8da1-48fc-a9f0-089154ab30e3" (UID: "dbe1bc10-8da1-48fc-a9f0-089154ab30e3"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:19:54.314725 master-0 kubenswrapper[13046]: I0308 03:19:54.314682 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:54.314725 master-0 kubenswrapper[13046]: I0308 03:19:54.314722 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:54.317247 master-0 kubenswrapper[13046]: I0308 03:19:54.317196 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dbe1bc10-8da1-48fc-a9f0-089154ab30e3" (UID: "dbe1bc10-8da1-48fc-a9f0-089154ab30e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:19:54.416028 master-0 kubenswrapper[13046]: I0308 03:19:54.415974 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe1bc10-8da1-48fc-a9f0-089154ab30e3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:54.921391 master-0 kubenswrapper[13046]: I0308 03:19:54.921155 13046 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="2548f09197f50d8cb959dd110f3a56a4fb32fbf469012609ca083cfd96f66597" exitCode=1 Mar 08 03:19:54.921391 master-0 kubenswrapper[13046]: I0308 03:19:54.921314 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"2548f09197f50d8cb959dd110f3a56a4fb32fbf469012609ca083cfd96f66597"} Mar 08 03:19:54.922378 master-0 kubenswrapper[13046]: I0308 03:19:54.921414 13046 scope.go:117] "RemoveContainer" containerID="0c2763066b9b93da23f0fac4ed741acd53596f68416bb8dfcb0cbbdd5cec3459" Mar 08 03:19:54.922378 master-0 kubenswrapper[13046]: I0308 03:19:54.922003 13046 scope.go:117] "RemoveContainer" containerID="2548f09197f50d8cb959dd110f3a56a4fb32fbf469012609ca083cfd96f66597" Mar 08 03:19:54.922378 master-0 kubenswrapper[13046]: E0308 03:19:54.922314 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(a1a56802af72ce1aac6b5077f1695ac0)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" Mar 08 03:19:54.924346 master-0 kubenswrapper[13046]: I0308 03:19:54.924217 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" 
event={"ID":"dbe1bc10-8da1-48fc-a9f0-089154ab30e3","Type":"ContainerDied","Data":"34f6c32ea7d6dc42aa98a275d3fd61a1e2b51b3169d9091f7ead891182d0165d"} Mar 08 03:19:54.924346 master-0 kubenswrapper[13046]: I0308 03:19:54.924259 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34f6c32ea7d6dc42aa98a275d3fd61a1e2b51b3169d9091f7ead891182d0165d" Mar 08 03:19:54.924346 master-0 kubenswrapper[13046]: I0308 03:19:54.924298 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 08 03:19:54.990423 master-0 kubenswrapper[13046]: E0308 03:19:54.990330 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:19:55.509448 master-0 kubenswrapper[13046]: I0308 03:19:55.509344 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:19:55.510447 master-0 kubenswrapper[13046]: I0308 03:19:55.510391 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:19:55.511949 master-0 kubenswrapper[13046]: E0308 03:19:55.511890 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:19:55.938762 master-0 kubenswrapper[13046]: I0308 03:19:55.938655 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_installer-2-retry-1-master-0_6e141bdb-2d04-481b-8614-cc0d57ebc317/installer/0.log" Mar 08 03:19:55.938762 master-0 kubenswrapper[13046]: I0308 03:19:55.938744 13046 generic.go:334] "Generic (PLEG): container finished" podID="6e141bdb-2d04-481b-8614-cc0d57ebc317" containerID="31d362a1e45d1ce8106f91d9ec48b041bf6440fcfe5b5eead08ffe0ccf96e9e3" exitCode=1 Mar 08 03:19:55.939821 master-0 kubenswrapper[13046]: I0308 03:19:55.938800 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6e141bdb-2d04-481b-8614-cc0d57ebc317","Type":"ContainerDied","Data":"31d362a1e45d1ce8106f91d9ec48b041bf6440fcfe5b5eead08ffe0ccf96e9e3"} Mar 08 03:19:56.948707 master-0 kubenswrapper[13046]: I0308 03:19:56.948635 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_ae3dcaba-5e1d-42b0-b34b-e3789030f973/installer/0.log" Mar 08 03:19:56.949645 master-0 kubenswrapper[13046]: I0308 03:19:56.948718 13046 generic.go:334] "Generic (PLEG): container finished" podID="ae3dcaba-5e1d-42b0-b34b-e3789030f973" containerID="41448e83cf6bc50b2fdf0224b92a970ff2490922cefb26b542a49d95b1099653" exitCode=1 Mar 08 03:19:56.949645 master-0 kubenswrapper[13046]: I0308 03:19:56.948803 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"ae3dcaba-5e1d-42b0-b34b-e3789030f973","Type":"ContainerDied","Data":"41448e83cf6bc50b2fdf0224b92a970ff2490922cefb26b542a49d95b1099653"} Mar 08 03:19:57.348660 master-0 kubenswrapper[13046]: I0308 03:19:57.348598 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-retry-1-master-0_6e141bdb-2d04-481b-8614-cc0d57ebc317/installer/0.log" Mar 08 03:19:57.348844 master-0 kubenswrapper[13046]: I0308 03:19:57.348692 13046 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 08 03:19:57.458298 master-0 kubenswrapper[13046]: I0308 03:19:57.458079 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-kubelet-dir\") pod \"6e141bdb-2d04-481b-8614-cc0d57ebc317\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " Mar 08 03:19:57.458298 master-0 kubenswrapper[13046]: I0308 03:19:57.458210 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e141bdb-2d04-481b-8614-cc0d57ebc317-kube-api-access\") pod \"6e141bdb-2d04-481b-8614-cc0d57ebc317\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " Mar 08 03:19:57.458298 master-0 kubenswrapper[13046]: I0308 03:19:57.458248 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-var-lock\") pod \"6e141bdb-2d04-481b-8614-cc0d57ebc317\" (UID: \"6e141bdb-2d04-481b-8614-cc0d57ebc317\") " Mar 08 03:19:57.458298 master-0 kubenswrapper[13046]: I0308 03:19:57.458245 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6e141bdb-2d04-481b-8614-cc0d57ebc317" (UID: "6e141bdb-2d04-481b-8614-cc0d57ebc317"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:19:57.458882 master-0 kubenswrapper[13046]: I0308 03:19:57.458390 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-var-lock" (OuterVolumeSpecName: "var-lock") pod "6e141bdb-2d04-481b-8614-cc0d57ebc317" (UID: "6e141bdb-2d04-481b-8614-cc0d57ebc317"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:19:57.458882 master-0 kubenswrapper[13046]: I0308 03:19:57.458804 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:57.458882 master-0 kubenswrapper[13046]: I0308 03:19:57.458827 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e141bdb-2d04-481b-8614-cc0d57ebc317-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:57.462674 master-0 kubenswrapper[13046]: I0308 03:19:57.462579 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e141bdb-2d04-481b-8614-cc0d57ebc317-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6e141bdb-2d04-481b-8614-cc0d57ebc317" (UID: "6e141bdb-2d04-481b-8614-cc0d57ebc317"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:19:57.560542 master-0 kubenswrapper[13046]: I0308 03:19:57.560303 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e141bdb-2d04-481b-8614-cc0d57ebc317-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:57.961050 master-0 kubenswrapper[13046]: I0308 03:19:57.960959 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-retry-1-master-0_6e141bdb-2d04-481b-8614-cc0d57ebc317/installer/0.log" Mar 08 03:19:57.961889 master-0 kubenswrapper[13046]: I0308 03:19:57.961177 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 08 03:19:57.961889 master-0 kubenswrapper[13046]: I0308 03:19:57.961210 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"6e141bdb-2d04-481b-8614-cc0d57ebc317","Type":"ContainerDied","Data":"ca7691e97a6b25562e97d1d8410acbd8b78f4439eb613fa35dbeefa1ee81d093"} Mar 08 03:19:57.961889 master-0 kubenswrapper[13046]: I0308 03:19:57.961416 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca7691e97a6b25562e97d1d8410acbd8b78f4439eb613fa35dbeefa1ee81d093" Mar 08 03:19:58.374234 master-0 kubenswrapper[13046]: I0308 03:19:58.374166 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_ae3dcaba-5e1d-42b0-b34b-e3789030f973/installer/0.log" Mar 08 03:19:58.374538 master-0 kubenswrapper[13046]: I0308 03:19:58.374266 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 08 03:19:58.472630 master-0 kubenswrapper[13046]: I0308 03:19:58.472546 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kube-api-access\") pod \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " Mar 08 03:19:58.473088 master-0 kubenswrapper[13046]: I0308 03:19:58.473044 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kubelet-dir\") pod \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " Mar 08 03:19:58.473419 master-0 kubenswrapper[13046]: I0308 03:19:58.473165 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ae3dcaba-5e1d-42b0-b34b-e3789030f973" (UID: "ae3dcaba-5e1d-42b0-b34b-e3789030f973"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:19:58.473715 master-0 kubenswrapper[13046]: I0308 03:19:58.473676 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-var-lock\") pod \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\" (UID: \"ae3dcaba-5e1d-42b0-b34b-e3789030f973\") " Mar 08 03:19:58.474034 master-0 kubenswrapper[13046]: I0308 03:19:58.473743 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-var-lock" (OuterVolumeSpecName: "var-lock") pod "ae3dcaba-5e1d-42b0-b34b-e3789030f973" (UID: "ae3dcaba-5e1d-42b0-b34b-e3789030f973"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:19:58.474358 master-0 kubenswrapper[13046]: I0308 03:19:58.474330 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:58.474542 master-0 kubenswrapper[13046]: I0308 03:19:58.474476 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:58.477153 master-0 kubenswrapper[13046]: I0308 03:19:58.477083 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ae3dcaba-5e1d-42b0-b34b-e3789030f973" (UID: "ae3dcaba-5e1d-42b0-b34b-e3789030f973"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:19:58.576398 master-0 kubenswrapper[13046]: I0308 03:19:58.576275 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae3dcaba-5e1d-42b0-b34b-e3789030f973-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:19:58.967668 master-0 kubenswrapper[13046]: I0308 03:19:58.967641 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_ae3dcaba-5e1d-42b0-b34b-e3789030f973/installer/0.log" Mar 08 03:19:58.968103 master-0 kubenswrapper[13046]: I0308 03:19:58.967690 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"ae3dcaba-5e1d-42b0-b34b-e3789030f973","Type":"ContainerDied","Data":"be3ff9a73b8b66d42c6b361633b72cbd7e4886918cde630551a752ea9f0d8ef7"} Mar 08 03:19:58.968103 master-0 kubenswrapper[13046]: I0308 03:19:58.967714 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be3ff9a73b8b66d42c6b361633b72cbd7e4886918cde630551a752ea9f0d8ef7" Mar 08 03:19:58.968103 master-0 kubenswrapper[13046]: I0308 03:19:58.967760 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 08 03:19:59.177503 master-0 kubenswrapper[13046]: I0308 03:19:59.177382 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:19:59.178315 master-0 kubenswrapper[13046]: I0308 03:19:59.178264 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:19:59.178875 master-0 kubenswrapper[13046]: E0308 03:19:59.178811 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:20:00.490225 master-0 kubenswrapper[13046]: I0308 03:20:00.490118 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:20:00.491328 master-0 kubenswrapper[13046]: I0308 03:20:00.491119 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:20:00.491552 master-0 kubenswrapper[13046]: E0308 03:20:00.491478 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:20:03.684713 master-0 kubenswrapper[13046]: I0308 03:20:03.684608 13046 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-2gxdj" Mar 08 03:20:04.991768 master-0 kubenswrapper[13046]: E0308 03:20:04.991673 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:20:05.902372 master-0 kubenswrapper[13046]: E0308 03:20:05.902182 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 03:20:06.031069 master-0 kubenswrapper[13046]: I0308 03:20:06.030984 13046 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1" exitCode=0 Mar 08 03:20:07.042398 master-0 kubenswrapper[13046]: I0308 03:20:07.042285 13046 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="ee9e0d0d09a586f2b381204377f48a7969c1d3611c01313a990fe888ae6b26f8" exitCode=0 Mar 08 03:20:07.042398 master-0 kubenswrapper[13046]: I0308 03:20:07.042365 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"ee9e0d0d09a586f2b381204377f48a7969c1d3611c01313a990fe888ae6b26f8"} Mar 08 03:20:08.681961 master-0 kubenswrapper[13046]: I0308 03:20:08.681890 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 08 03:20:08.683090 master-0 kubenswrapper[13046]: I0308 03:20:08.682002 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:20:08.832404 master-0 kubenswrapper[13046]: I0308 03:20:08.832210 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 08 03:20:08.832404 master-0 kubenswrapper[13046]: I0308 03:20:08.832370 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 08 03:20:08.832786 master-0 kubenswrapper[13046]: I0308 03:20:08.832749 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs" (OuterVolumeSpecName: "certs") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:20:08.833000 master-0 kubenswrapper[13046]: I0308 03:20:08.832936 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir" (OuterVolumeSpecName: "data-dir") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:20:08.833139 master-0 kubenswrapper[13046]: I0308 03:20:08.833089 13046 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:20:08.833139 master-0 kubenswrapper[13046]: I0308 03:20:08.833131 13046 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:20:09.060932 master-0 kubenswrapper[13046]: I0308 03:20:09.060863 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 08 03:20:09.060932 master-0 kubenswrapper[13046]: I0308 03:20:09.060922 13046 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044" exitCode=137 Mar 08 03:20:09.061413 master-0 kubenswrapper[13046]: I0308 03:20:09.060971 13046 scope.go:117] "RemoveContainer" containerID="d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1" Mar 08 03:20:09.061413 master-0 kubenswrapper[13046]: I0308 03:20:09.061046 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:20:09.079871 master-0 kubenswrapper[13046]: I0308 03:20:09.079818 13046 scope.go:117] "RemoveContainer" containerID="a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044" Mar 08 03:20:09.101913 master-0 kubenswrapper[13046]: I0308 03:20:09.101846 13046 scope.go:117] "RemoveContainer" containerID="d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1" Mar 08 03:20:09.102517 master-0 kubenswrapper[13046]: E0308 03:20:09.102422 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1\": container with ID starting with d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1 not found: ID does not exist" containerID="d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1" Mar 08 03:20:09.102610 master-0 kubenswrapper[13046]: I0308 03:20:09.102529 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1"} err="failed to get container status \"d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1\": rpc error: code = NotFound desc = could not find container \"d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1\": container with ID starting with d6d27365e9ce8b1f87e79a948687d757d5252dd57e9f42f60f6f4075bd0b20a1 not found: ID does not exist" Mar 08 03:20:09.102610 master-0 kubenswrapper[13046]: I0308 03:20:09.102577 13046 scope.go:117] "RemoveContainer" containerID="a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044" Mar 08 03:20:09.103076 master-0 kubenswrapper[13046]: E0308 03:20:09.103014 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044\": container with ID starting with a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044 not found: ID does not exist" containerID="a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044" Mar 08 03:20:09.103154 master-0 kubenswrapper[13046]: I0308 03:20:09.103072 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044"} err="failed to get container status \"a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044\": rpc error: code = NotFound desc = could not find container \"a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044\": container with ID starting with a786901e931a9cc4dc32846082c66e3a3e75b3dde8098d964698809e55940044 not found: ID does not exist" Mar 08 03:20:10.118320 master-0 kubenswrapper[13046]: I0308 03:20:10.118208 13046 scope.go:117] "RemoveContainer" containerID="2548f09197f50d8cb959dd110f3a56a4fb32fbf469012609ca083cfd96f66597" Mar 08 03:20:10.135266 master-0 kubenswrapper[13046]: I0308 03:20:10.135183 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f29997baa583b6238f7de9108ee10" path="/var/lib/kubelet/pods/354f29997baa583b6238f7de9108ee10/volumes" Mar 08 03:20:10.135845 master-0 kubenswrapper[13046]: I0308 03:20:10.135797 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 03:20:12.560375 master-0 kubenswrapper[13046]: E0308 03:20:12.560182 13046 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189abf845d532b02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:19:38.546031362 +0000 UTC m=+380.624798619,LastTimestamp:2026-03-08 03:19:38.546031362 +0000 UTC m=+380.624798619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:20:14.113169 master-0 kubenswrapper[13046]: I0308 03:20:14.113076 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_1e34eb33-c8a4-4207-b78c-53e0cfc09e91/installer/0.log" Mar 08 03:20:14.113169 master-0 kubenswrapper[13046]: I0308 03:20:14.113193 13046 generic.go:334] "Generic (PLEG): container finished" podID="1e34eb33-c8a4-4207-b78c-53e0cfc09e91" containerID="81690b2979176695e8530d71a981c729b1a1352c3403e63acd375ec96d2ffadd" exitCode=1 Mar 08 03:20:14.993378 master-0 kubenswrapper[13046]: E0308 03:20:14.992993 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:20:20.051835 master-0 kubenswrapper[13046]: E0308 03:20:20.051715 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 03:20:21.169808 master-0 kubenswrapper[13046]: I0308 03:20:21.169704 13046 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="6a9030a5172fce4ebbd72e64483be0822029623403e551ee893bd3851fc9d988" exitCode=0 Mar 08 03:20:24.995008 
master-0 kubenswrapper[13046]: E0308 03:20:24.994905 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 03:20:34.179025 master-0 kubenswrapper[13046]: E0308 03:20:34.178932 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 08 03:20:34.634636 master-0 kubenswrapper[13046]: I0308 03:20:34.634594 13046 patch_prober.go:28] interesting pod/machine-config-daemon-j6n9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 03:20:34.634636 master-0 kubenswrapper[13046]: I0308 03:20:34.634645 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 03:20:34.995593 master-0 kubenswrapper[13046]: E0308 03:20:34.995303 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:20:34.995593 master-0 kubenswrapper[13046]: I0308 03:20:34.995373 13046 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 03:20:38.325861 master-0 kubenswrapper[13046]: I0308 03:20:38.325807 13046 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xjg74_a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/approver/1.log" Mar 08 03:20:38.336190 master-0 kubenswrapper[13046]: I0308 03:20:38.336115 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xjg74_a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/approver/0.log" Mar 08 03:20:38.337744 master-0 kubenswrapper[13046]: I0308 03:20:38.337679 13046 generic.go:334] "Generic (PLEG): container finished" podID="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" containerID="97b7bdeb275d63cf51a95c68c58ee5ca2e124f9930c59d4a9b4c4dfb86ee0b8c" exitCode=1 Mar 08 03:20:42.830500 master-0 kubenswrapper[13046]: I0308 03:20:42.830397 13046 status_manager.go:851] "Failed to get status for pod" podUID="b33ed2de-435b-4ccc-8dfd-29d52bf95ea8" pod="openshift-insights/insights-operator-8f89dfddd-zd6kq" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods insights-operator-8f89dfddd-zd6kq)" Mar 08 03:20:44.138994 master-0 kubenswrapper[13046]: E0308 03:20:44.138885 13046 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:20:44.139794 master-0 kubenswrapper[13046]: E0308 03:20:44.139161 13046 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.022s" Mar 08 03:20:44.143332 master-0 kubenswrapper[13046]: I0308 03:20:44.143273 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:20:44.143703 master-0 kubenswrapper[13046]: E0308 03:20:44.143630 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:20:44.143821 master-0 kubenswrapper[13046]: I0308 03:20:44.143788 13046 scope.go:117] "RemoveContainer" containerID="97b7bdeb275d63cf51a95c68c58ee5ca2e124f9930c59d4a9b4c4dfb86ee0b8c" Mar 08 03:20:44.144145 master-0 kubenswrapper[13046]: E0308 03:20:44.144093 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"approver\" with CrashLoopBackOff: \"back-off 10s restarting failed container=approver pod=network-node-identity-xjg74_openshift-network-node-identity(a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6)\"" pod="openshift-network-node-identity/network-node-identity-xjg74" podUID="a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6" Mar 08 03:20:44.151331 master-0 kubenswrapper[13046]: I0308 03:20:44.151258 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 03:20:44.995976 master-0 kubenswrapper[13046]: E0308 03:20:44.995871 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 08 03:20:46.563542 master-0 kubenswrapper[13046]: E0308 03:20:46.563279 13046 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{insights-operator-8f89dfddd-zd6kq.189abf6e705008ec openshift-insights 10188 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-insights,Name:insights-operator-8f89dfddd-zd6kq,UID:b33ed2de-435b-4ccc-8dfd-29d52bf95ea8,APIVersion:v1,ResourceVersion:9414,FieldPath:spec.containers{insights-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:18:04 +0000 UTC,LastTimestamp:2026-03-08 03:19:42.121665796 +0000 UTC m=+384.200433043,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:20:55.197531 master-0 kubenswrapper[13046]: E0308 03:20:55.197285 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 08 03:21:02.961209 master-0 kubenswrapper[13046]: E0308 03:21:02.961103 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:20:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:20:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:20:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:20:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:21:04.634354 master-0 kubenswrapper[13046]: I0308 03:21:04.634266 13046 patch_prober.go:28] interesting pod/machine-config-daemon-j6n9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 03:21:04.635207 master-0 kubenswrapper[13046]: I0308 03:21:04.634368 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 03:21:05.598658 master-0 kubenswrapper[13046]: E0308 03:21:05.598534 13046 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 08 03:21:12.962622 master-0 kubenswrapper[13046]: E0308 03:21:12.962464 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:21:16.399725 master-0 kubenswrapper[13046]: E0308 03:21:16.399613 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 08 03:21:18.155470 master-0 kubenswrapper[13046]: E0308 03:21:18.155382 13046 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 08 03:21:18.156544 master-0 kubenswrapper[13046]: E0308 03:21:18.155669 13046 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Mar 08 03:21:18.156544 master-0 kubenswrapper[13046]: I0308 03:21:18.155802 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"948a7b297afa3e564a934c704f69529deb5cbe59409e42dcc466920922ad1702"} Mar 08 03:21:18.170883 master-0 kubenswrapper[13046]: I0308 03:21:18.170802 13046 scope.go:117] "RemoveContainer" 
containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:21:18.174588 master-0 kubenswrapper[13046]: E0308 03:21:18.174443 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:21:18.176415 master-0 kubenswrapper[13046]: I0308 03:21:18.176348 13046 scope.go:117] "RemoveContainer" containerID="97b7bdeb275d63cf51a95c68c58ee5ca2e124f9930c59d4a9b4c4dfb86ee0b8c" Mar 08 03:21:18.182592 master-0 kubenswrapper[13046]: I0308 03:21:18.182477 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 03:21:18.599928 master-0 kubenswrapper[13046]: I0308 03:21:18.599874 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_1e34eb33-c8a4-4207-b78c-53e0cfc09e91/installer/0.log" Mar 08 03:21:18.600125 master-0 kubenswrapper[13046]: I0308 03:21:18.599972 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 08 03:21:18.793272 master-0 kubenswrapper[13046]: I0308 03:21:18.793125 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-var-lock\") pod \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " Mar 08 03:21:18.793272 master-0 kubenswrapper[13046]: I0308 03:21:18.793225 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kubelet-dir\") pod \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " Mar 08 03:21:18.793625 master-0 kubenswrapper[13046]: I0308 03:21:18.793470 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kube-api-access\") pod \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\" (UID: \"1e34eb33-c8a4-4207-b78c-53e0cfc09e91\") " Mar 08 03:21:18.793713 master-0 kubenswrapper[13046]: I0308 03:21:18.793448 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1e34eb33-c8a4-4207-b78c-53e0cfc09e91" (UID: "1e34eb33-c8a4-4207-b78c-53e0cfc09e91"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:21:18.793843 master-0 kubenswrapper[13046]: I0308 03:21:18.793767 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-var-lock" (OuterVolumeSpecName: "var-lock") pod "1e34eb33-c8a4-4207-b78c-53e0cfc09e91" (UID: "1e34eb33-c8a4-4207-b78c-53e0cfc09e91"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:21:18.793843 master-0 kubenswrapper[13046]: I0308 03:21:18.793836 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:21:18.798092 master-0 kubenswrapper[13046]: I0308 03:21:18.798036 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1e34eb33-c8a4-4207-b78c-53e0cfc09e91" (UID: "1e34eb33-c8a4-4207-b78c-53e0cfc09e91"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:21:18.894864 master-0 kubenswrapper[13046]: I0308 03:21:18.894695 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:21:18.894864 master-0 kubenswrapper[13046]: I0308 03:21:18.894738 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e34eb33-c8a4-4207-b78c-53e0cfc09e91-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:21:19.222211 master-0 kubenswrapper[13046]: I0308 03:21:19.222151 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_1e34eb33-c8a4-4207-b78c-53e0cfc09e91/installer/0.log" Mar 08 03:21:19.223036 master-0 kubenswrapper[13046]: I0308 03:21:19.222372 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 08 03:21:19.227674 master-0 kubenswrapper[13046]: I0308 03:21:19.227580 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xjg74_a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/approver/1.log" Mar 08 03:21:19.228527 master-0 kubenswrapper[13046]: I0308 03:21:19.228450 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xjg74_a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/approver/0.log" Mar 08 03:21:20.566958 master-0 kubenswrapper[13046]: E0308 03:21:20.566755 13046 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{insights-operator-8f89dfddd-zd6kq.189abf6e7dc0372a openshift-insights 10197 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-insights,Name:insights-operator-8f89dfddd-zd6kq,UID:b33ed2de-435b-4ccc-8dfd-29d52bf95ea8,APIVersion:v1,ResourceVersion:9414,FieldPath:spec.containers{insights-operator},},Reason:Created,Message:Created container: insights-operator,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:18:04 +0000 UTC,LastTimestamp:2026-03-08 03:19:42.319809132 +0000 UTC m=+384.398576379,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:21:25.243922 master-0 kubenswrapper[13046]: E0308 03:21:25.243840 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/6883c4d457e8e6721bc051f311e603087d24fd4c1e513d035cfd6a59f2c949b5/diff" to get inode usage: stat /var/lib/containers/storage/overlay/6883c4d457e8e6721bc051f311e603087d24fd4c1e513d035cfd6a59f2c949b5/diff: no such file 
or directory, extraDiskErr: Mar 08 03:21:28.001407 master-0 kubenswrapper[13046]: E0308 03:21:28.001325 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 08 03:21:28.306517 master-0 kubenswrapper[13046]: I0308 03:21:28.306383 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/2.log" Mar 08 03:21:28.309942 master-0 kubenswrapper[13046]: I0308 03:21:28.309875 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/1.log" Mar 08 03:21:28.310111 master-0 kubenswrapper[13046]: I0308 03:21:28.309980 13046 generic.go:334] "Generic (PLEG): container finished" podID="555ae3b4-71c6-4b62-9e09-66a58ae4c6ad" containerID="470f17a045f36232ef4552e6c7cc64d2eb892c434c99999582d5fb1ef0f7249c" exitCode=1 Mar 08 03:21:29.318042 master-0 kubenswrapper[13046]: I0308 03:21:29.317980 13046 generic.go:334] "Generic (PLEG): container finished" podID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerID="1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244" exitCode=0 Mar 08 03:21:29.320648 master-0 kubenswrapper[13046]: I0308 03:21:29.320570 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-64bf9778cb-7hsbf_eedc7538-9cc6-4bf5-9628-e278310d796b/marketplace-operator/2.log" Mar 08 03:21:29.320720 master-0 kubenswrapper[13046]: I0308 03:21:29.320669 13046 generic.go:334] "Generic (PLEG): container finished" podID="eedc7538-9cc6-4bf5-9628-e278310d796b" 
containerID="860b9f9ba562c928fcdd286d234e1fea2b89e768618478368682f81163d4acf5" exitCode=0 Mar 08 03:21:33.673904 master-0 kubenswrapper[13046]: I0308 03:21:33.673822 13046 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-7hsbf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" start-of-body= Mar 08 03:21:33.674839 master-0 kubenswrapper[13046]: I0308 03:21:33.673915 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" Mar 08 03:21:33.675104 master-0 kubenswrapper[13046]: I0308 03:21:33.675050 13046 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-7hsbf container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" start-of-body= Mar 08 03:21:33.675217 master-0 kubenswrapper[13046]: I0308 03:21:33.675145 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.22:8080/healthz\": dial tcp 10.128.0.22:8080: connect: connection refused" Mar 08 03:21:34.029997 master-0 kubenswrapper[13046]: I0308 03:21:34.027977 13046 patch_prober.go:28] interesting pod/controller-manager-6494b94d74-kwkcq container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.70:8443/healthz\": dial tcp 10.128.0.70:8443: connect: connection 
refused" start-of-body= Mar 08 03:21:34.029997 master-0 kubenswrapper[13046]: I0308 03:21:34.028054 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.70:8443/healthz\": dial tcp 10.128.0.70:8443: connect: connection refused" Mar 08 03:21:34.029997 master-0 kubenswrapper[13046]: I0308 03:21:34.029536 13046 patch_prober.go:28] interesting pod/controller-manager-6494b94d74-kwkcq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.70:8443/healthz\": dial tcp 10.128.0.70:8443: connect: connection refused" start-of-body= Mar 08 03:21:34.029997 master-0 kubenswrapper[13046]: I0308 03:21:34.029630 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.70:8443/healthz\": dial tcp 10.128.0.70:8443: connect: connection refused" Mar 08 03:21:34.634357 master-0 kubenswrapper[13046]: I0308 03:21:34.634280 13046 patch_prober.go:28] interesting pod/machine-config-daemon-j6n9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 03:21:34.634605 master-0 kubenswrapper[13046]: I0308 03:21:34.634387 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 08 03:21:36.556444 master-0 kubenswrapper[13046]: E0308 03:21:36.556348 13046 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="18.4s" Mar 08 03:21:36.560167 master-0 kubenswrapper[13046]: I0308 03:21:36.559560 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:21:36.560167 master-0 kubenswrapper[13046]: E0308 03:21:36.559891 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:21:36.571573 master-0 kubenswrapper[13046]: I0308 03:21:36.571453 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 08 03:21:36.576659 master-0 kubenswrapper[13046]: I0308 03:21:36.576584 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:21:36.576799 master-0 kubenswrapper[13046]: I0308 03:21:36.576655 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"1e34eb33-c8a4-4207-b78c-53e0cfc09e91","Type":"ContainerDied","Data":"81690b2979176695e8530d71a981c729b1a1352c3403e63acd375ec96d2ffadd"} Mar 08 03:21:36.576799 master-0 kubenswrapper[13046]: I0308 03:21:36.576704 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"6a9030a5172fce4ebbd72e64483be0822029623403e551ee893bd3851fc9d988"} Mar 08 03:21:36.576799 master-0 kubenswrapper[13046]: 
I0308 03:21:36.576735 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"e1b00a14469a47df3d1c6604e46056869c05736404240350aded7afc0849e26f"} Mar 08 03:21:36.576799 master-0 kubenswrapper[13046]: I0308 03:21:36.576753 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"629a6a241a2d1d661b629f926dfc97913fe35e572a73fd9138171ed9094a810b"} Mar 08 03:21:36.576799 master-0 kubenswrapper[13046]: I0308 03:21:36.576773 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"049b4df4a4fc66b3c0226a03b7c40ddde6ec0b3d4e06f8c6d05944a85e4087db"} Mar 08 03:21:36.576799 master-0 kubenswrapper[13046]: I0308 03:21:36.576791 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"1ed1aec9b4ff351c3f27b466fd64b3fbcf5b2d8fb5246d44cfc7444b4a21e7eb"} Mar 08 03:21:36.576799 master-0 kubenswrapper[13046]: I0308 03:21:36.576812 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"a0e652b5d13c7a824da8985d4be46120ae0d587c80e6a7574d7e516579ff234f"} Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.576832 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xjg74" event={"ID":"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6","Type":"ContainerDied","Data":"97b7bdeb275d63cf51a95c68c58ee5ca2e124f9930c59d4a9b4c4dfb86ee0b8c"} Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.576862 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"1e34eb33-c8a4-4207-b78c-53e0cfc09e91","Type":"ContainerDied","Data":"1b05f5f5d5046101c0a6daa5ca65ac0027c737756a9e67f8f82bc73503b0e70d"} Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.576884 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b05f5f5d5046101c0a6daa5ca65ac0027c737756a9e67f8f82bc73503b0e70d" Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.576904 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xjg74" event={"ID":"a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6","Type":"ContainerStarted","Data":"08e1f4650b1977d738cce481cf9e9fd26a6f122f81b7cf69ddd938487b8420cc"} Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.576932 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" event={"ID":"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad","Type":"ContainerDied","Data":"470f17a045f36232ef4552e6c7cc64d2eb892c434c99999582d5fb1ef0f7249c"} Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.576959 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" event={"ID":"6bd07fa0-00f3-4267-b64a-1e7c02fdf148","Type":"ContainerDied","Data":"1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244"} Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.576984 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerDied","Data":"860b9f9ba562c928fcdd286d234e1fea2b89e768618478368682f81163d4acf5"} Mar 08 03:21:36.577269 master-0 kubenswrapper[13046]: I0308 03:21:36.577140 13046 scope.go:117] "RemoveContainer" 
containerID="0f90c7e80ee619a77867feffa666b20dfa8fad2e9ecc5d700b999460ff6d737b" Mar 08 03:21:36.579016 master-0 kubenswrapper[13046]: I0308 03:21:36.577837 13046 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"519bc3beb14de1a649f5b4efc69449f7665f68f38bd11235ec05e6e67ad8ad4d"} pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 03:21:36.579016 master-0 kubenswrapper[13046]: I0308 03:21:36.577975 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" containerID="cri-o://519bc3beb14de1a649f5b4efc69449f7665f68f38bd11235ec05e6e67ad8ad4d" gracePeriod=600 Mar 08 03:21:36.579016 master-0 kubenswrapper[13046]: I0308 03:21:36.578368 13046 scope.go:117] "RemoveContainer" containerID="470f17a045f36232ef4552e6c7cc64d2eb892c434c99999582d5fb1ef0f7249c" Mar 08 03:21:36.579016 master-0 kubenswrapper[13046]: E0308 03:21:36.578881 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-j6jpn_openshift-cluster-storage-operator(555ae3b4-71c6-4b62-9e09-66a58ae4c6ad)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" podUID="555ae3b4-71c6-4b62-9e09-66a58ae4c6ad" Mar 08 03:21:36.580073 master-0 kubenswrapper[13046]: I0308 03:21:36.579126 13046 scope.go:117] "RemoveContainer" containerID="860b9f9ba562c928fcdd286d234e1fea2b89e768618478368682f81163d4acf5" Mar 08 03:21:36.580073 master-0 kubenswrapper[13046]: I0308 03:21:36.579435 13046 scope.go:117] "RemoveContainer" 
containerID="1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244" Mar 08 03:21:36.580317 master-0 kubenswrapper[13046]: E0308 03:21:36.579466 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b" Mar 08 03:21:36.592562 master-0 kubenswrapper[13046]: I0308 03:21:36.592450 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 08 03:21:36.612036 master-0 kubenswrapper[13046]: I0308 03:21:36.611706 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 03:21:36.612036 master-0 kubenswrapper[13046]: I0308 03:21:36.611794 13046 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="0048fe1f-b794-48e6-8471-39e8c8ba52b4" Mar 08 03:21:36.626338 master-0 kubenswrapper[13046]: I0308 03:21:36.626233 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 08 03:21:36.626338 master-0 kubenswrapper[13046]: I0308 03:21:36.626282 13046 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="0048fe1f-b794-48e6-8471-39e8c8ba52b4" Mar 08 03:21:36.638112 master-0 kubenswrapper[13046]: I0308 03:21:36.637782 13046 scope.go:117] "RemoveContainer" containerID="af9687e18f6bdb247588a56f5a4a95f475d79f1ed57a7907b4b508a7261cfc09" Mar 08 03:21:36.643623 master-0 kubenswrapper[13046]: I0308 03:21:36.642033 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 08 03:21:36.740143 master-0 
kubenswrapper[13046]: I0308 03:21:36.734949 13046 scope.go:117] "RemoveContainer" containerID="db29d09009f4759a0d9772e7362e06d305b3037cee895a8cc0418b2cf27421d4" Mar 08 03:21:36.953841 master-0 kubenswrapper[13046]: I0308 03:21:36.953756 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.953736328 podStartE2EDuration="953.736328ms" podCreationTimestamp="2026-03-08 03:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:21:36.950646783 +0000 UTC m=+499.029414010" watchObservedRunningTime="2026-03-08 03:21:36.953736328 +0000 UTC m=+499.032503545" Mar 08 03:21:37.388404 master-0 kubenswrapper[13046]: I0308 03:21:37.388366 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/2.log" Mar 08 03:21:37.390705 master-0 kubenswrapper[13046]: I0308 03:21:37.390667 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/cluster-cloud-controller-manager/0.log" Mar 08 03:21:37.390770 master-0 kubenswrapper[13046]: I0308 03:21:37.390723 13046 generic.go:334] "Generic (PLEG): container finished" podID="52836130-d42e-495c-adbf-19ff9a393347" containerID="8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b" exitCode=1 Mar 08 03:21:37.390806 master-0 kubenswrapper[13046]: I0308 03:21:37.390781 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerDied","Data":"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b"} Mar 
08 03:21:37.391286 master-0 kubenswrapper[13046]: I0308 03:21:37.391258 13046 scope.go:117] "RemoveContainer" containerID="8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b" Mar 08 03:21:37.395021 master-0 kubenswrapper[13046]: I0308 03:21:37.394990 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" event={"ID":"6bd07fa0-00f3-4267-b64a-1e7c02fdf148","Type":"ContainerStarted","Data":"b49e315423dd470e6b6da770027936f226df443789484303d1380c81a8337f72"} Mar 08 03:21:37.395985 master-0 kubenswrapper[13046]: I0308 03:21:37.395953 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:21:37.400011 master-0 kubenswrapper[13046]: I0308 03:21:37.399991 13046 generic.go:334] "Generic (PLEG): container finished" podID="1092f2a6-865c-4706-bba7-068621e85ebc" containerID="519bc3beb14de1a649f5b4efc69449f7665f68f38bd11235ec05e6e67ad8ad4d" exitCode=0 Mar 08 03:21:37.400124 master-0 kubenswrapper[13046]: I0308 03:21:37.400039 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" event={"ID":"1092f2a6-865c-4706-bba7-068621e85ebc","Type":"ContainerDied","Data":"519bc3beb14de1a649f5b4efc69449f7665f68f38bd11235ec05e6e67ad8ad4d"} Mar 08 03:21:37.400206 master-0 kubenswrapper[13046]: I0308 03:21:37.400192 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" event={"ID":"1092f2a6-865c-4706-bba7-068621e85ebc","Type":"ContainerStarted","Data":"9d49efaffbe78e0cb01e3cafa2637d00f74bc46a5f970b64c00fdc01f5a452ba"} Mar 08 03:21:37.400559 master-0 kubenswrapper[13046]: I0308 03:21:37.400525 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:21:37.402040 master-0 
kubenswrapper[13046]: I0308 03:21:37.402027 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xjg74_a1ceb611-22e9-4a5e-b965-f4a6e2bfd3d6/approver/1.log" Mar 08 03:21:37.417465 master-0 kubenswrapper[13046]: E0308 03:21:37.417430 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 08 03:21:38.414346 master-0 kubenswrapper[13046]: I0308 03:21:38.414292 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/config-sync-controllers/0.log" Mar 08 03:21:38.415287 master-0 kubenswrapper[13046]: I0308 03:21:38.415262 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/cluster-cloud-controller-manager/0.log" Mar 08 03:21:38.415406 master-0 kubenswrapper[13046]: I0308 03:21:38.415387 13046 generic.go:334] "Generic (PLEG): container finished" podID="52836130-d42e-495c-adbf-19ff9a393347" containerID="7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d" exitCode=1 Mar 08 03:21:38.415587 master-0 kubenswrapper[13046]: I0308 03:21:38.415546 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerDied","Data":"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d"} Mar 08 03:21:38.415644 master-0 kubenswrapper[13046]: I0308 03:21:38.415615 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" 
event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerStarted","Data":"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f"} Mar 08 03:21:38.416387 master-0 kubenswrapper[13046]: I0308 03:21:38.416328 13046 scope.go:117] "RemoveContainer" containerID="7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d" Mar 08 03:21:39.430395 master-0 kubenswrapper[13046]: I0308 03:21:39.429973 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/config-sync-controllers/0.log" Mar 08 03:21:39.431401 master-0 kubenswrapper[13046]: I0308 03:21:39.430862 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/cluster-cloud-controller-manager/0.log" Mar 08 03:21:39.431401 master-0 kubenswrapper[13046]: I0308 03:21:39.430995 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerStarted","Data":"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6"} Mar 08 03:21:41.592965 master-0 kubenswrapper[13046]: I0308 03:21:41.592900 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 08 03:21:41.630898 master-0 kubenswrapper[13046]: I0308 03:21:41.630821 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 08 03:21:42.138051 master-0 kubenswrapper[13046]: I0308 03:21:42.137978 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: E0308 
03:21:42.138241 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe1bc10-8da1-48fc-a9f0-089154ab30e3" containerName="installer"
Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: I0308 03:21:42.138260 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe1bc10-8da1-48fc-a9f0-089154ab30e3" containerName="installer"
Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: E0308 03:21:42.138278 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e34eb33-c8a4-4207-b78c-53e0cfc09e91" containerName="installer"
Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: I0308 03:21:42.138285 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e34eb33-c8a4-4207-b78c-53e0cfc09e91" containerName="installer"
Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: E0308 03:21:42.138310 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e141bdb-2d04-481b-8614-cc0d57ebc317" containerName="installer"
Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: I0308 03:21:42.138319 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e141bdb-2d04-481b-8614-cc0d57ebc317" containerName="installer"
Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: E0308 03:21:42.138330 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3dcaba-5e1d-42b0-b34b-e3789030f973" containerName="installer"
Mar 08 03:21:42.138336 master-0 kubenswrapper[13046]: I0308 03:21:42.138338 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3dcaba-5e1d-42b0-b34b-e3789030f973" containerName="installer"
Mar 08 03:21:42.138880 master-0 kubenswrapper[13046]: I0308 03:21:42.138453 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3dcaba-5e1d-42b0-b34b-e3789030f973" containerName="installer"
Mar 08 03:21:42.138880 master-0 kubenswrapper[13046]: I0308 03:21:42.138506 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe1bc10-8da1-48fc-a9f0-089154ab30e3" containerName="installer"
Mar 08 03:21:42.138880 master-0 kubenswrapper[13046]: I0308 03:21:42.138521 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e34eb33-c8a4-4207-b78c-53e0cfc09e91" containerName="installer"
Mar 08 03:21:42.138880 master-0 kubenswrapper[13046]: I0308 03:21:42.138544 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e141bdb-2d04-481b-8614-cc0d57ebc317" containerName="installer"
Mar 08 03:21:42.139111 master-0 kubenswrapper[13046]: I0308 03:21:42.138966 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.144559 master-0 kubenswrapper[13046]: I0308 03:21:42.144508 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 08 03:21:42.144779 master-0 kubenswrapper[13046]: I0308 03:21:42.144664 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-gcdsk"
Mar 08 03:21:42.163018 master-0 kubenswrapper[13046]: I0308 03:21:42.162946 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 08 03:21:42.264419 master-0 kubenswrapper[13046]: I0308 03:21:42.264362 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.264669 master-0 kubenswrapper[13046]: I0308 03:21:42.264438 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.264669 master-0 kubenswrapper[13046]: I0308 03:21:42.264523 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-var-lock\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.366970 master-0 kubenswrapper[13046]: I0308 03:21:42.366885 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.366970 master-0 kubenswrapper[13046]: I0308 03:21:42.366976 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.367320 master-0 kubenswrapper[13046]: I0308 03:21:42.367024 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-var-lock\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.367320 master-0 kubenswrapper[13046]: I0308 03:21:42.367132 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-var-lock\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.367320 master-0 kubenswrapper[13046]: I0308 03:21:42.367211 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.399356 master-0 kubenswrapper[13046]: I0308 03:21:42.399224 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.470705 master-0 kubenswrapper[13046]: I0308 03:21:42.470609 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 08 03:21:42.475514 master-0 kubenswrapper[13046]: I0308 03:21:42.475412 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 08 03:21:42.941177 master-0 kubenswrapper[13046]: I0308 03:21:42.941066 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 08 03:21:42.948863 master-0 kubenswrapper[13046]: W0308 03:21:42.948824 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3035f9a_535f_4d1a_b3a0_02e2511894ff.slice/crio-843011d22a9efa2481f47e7bc77fd2a5c2a55de0a67168e32106b70a1ab1c8ad WatchSource:0}: Error finding container 843011d22a9efa2481f47e7bc77fd2a5c2a55de0a67168e32106b70a1ab1c8ad: Status 404 returned error can't find the container with id 843011d22a9efa2481f47e7bc77fd2a5c2a55de0a67168e32106b70a1ab1c8ad
Mar 08 03:21:43.463879 master-0 kubenswrapper[13046]: I0308 03:21:43.463675 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d3035f9a-535f-4d1a-b3a0-02e2511894ff","Type":"ContainerStarted","Data":"d621d2f82abc5387792939e7e66660c43e46a86bde107e627d0911e7d0c6b86c"}
Mar 08 03:21:43.463879 master-0 kubenswrapper[13046]: I0308 03:21:43.463756 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d3035f9a-535f-4d1a-b3a0-02e2511894ff","Type":"ContainerStarted","Data":"843011d22a9efa2481f47e7bc77fd2a5c2a55de0a67168e32106b70a1ab1c8ad"}
Mar 08 03:21:43.497368 master-0 kubenswrapper[13046]: I0308 03:21:43.497286 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=1.497260534 podStartE2EDuration="1.497260534s" podCreationTimestamp="2026-03-08 03:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:21:43.493393687 +0000 UTC m=+505.572160954" watchObservedRunningTime="2026-03-08 03:21:43.497260534 +0000 UTC m=+505.576027791"
Mar 08 03:21:43.674263 master-0 kubenswrapper[13046]: I0308 03:21:43.674187 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:21:43.674825 master-0 kubenswrapper[13046]: I0308 03:21:43.674277 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:21:43.675039 master-0 kubenswrapper[13046]: I0308 03:21:43.675005 13046 scope.go:117] "RemoveContainer" containerID="860b9f9ba562c928fcdd286d234e1fea2b89e768618478368682f81163d4acf5"
Mar 08 03:21:43.675360 master-0 kubenswrapper[13046]: E0308 03:21:43.675313 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b"
Mar 08 03:21:49.118698 master-0 kubenswrapper[13046]: I0308 03:21:49.118619 13046 scope.go:117] "RemoveContainer" containerID="470f17a045f36232ef4552e6c7cc64d2eb892c434c99999582d5fb1ef0f7249c"
Mar 08 03:21:49.513974 master-0 kubenswrapper[13046]: I0308 03:21:49.513811 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/2.log"
Mar 08 03:21:49.513974 master-0 kubenswrapper[13046]: I0308 03:21:49.513870 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-j6jpn" event={"ID":"555ae3b4-71c6-4b62-9e09-66a58ae4c6ad","Type":"ContainerStarted","Data":"ba591bc14bfa4220bb36d2dfb6c12af49eb8d4db68ffef27e517469b56217a63"}
Mar 08 03:21:51.118976 master-0 kubenswrapper[13046]: I0308 03:21:51.118910 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7"
Mar 08 03:21:51.119659 master-0 kubenswrapper[13046]: E0308 03:21:51.119396 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:21:55.119133 master-0 kubenswrapper[13046]: I0308 03:21:55.119054 13046 scope.go:117] "RemoveContainer" containerID="860b9f9ba562c928fcdd286d234e1fea2b89e768618478368682f81163d4acf5"
Mar 08 03:21:55.120121 master-0 kubenswrapper[13046]: E0308 03:21:55.119408 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=marketplace-operator pod=marketplace-operator-64bf9778cb-7hsbf_openshift-marketplace(eedc7538-9cc6-4bf5-9628-e278310d796b)\"" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" podUID="eedc7538-9cc6-4bf5-9628-e278310d796b"
Mar 08 03:21:59.526239 master-0 kubenswrapper[13046]: I0308 03:21:59.526103 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-2-master-0"]
Mar 08 03:21:59.529934 master-0 kubenswrapper[13046]: I0308 03:21:59.527379 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.530428 master-0 kubenswrapper[13046]: I0308 03:21:59.530352 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ks2rl"
Mar 08 03:21:59.530428 master-0 kubenswrapper[13046]: I0308 03:21:59.530411 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 08 03:21:59.532837 master-0 kubenswrapper[13046]: I0308 03:21:59.532783 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-2-master-0"]
Mar 08 03:21:59.536613 master-0 kubenswrapper[13046]: I0308 03:21:59.534307 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.540575 master-0 kubenswrapper[13046]: I0308 03:21:59.540475 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-2-master-0"]
Mar 08 03:21:59.579717 master-0 kubenswrapper[13046]: I0308 03:21:59.579665 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-gsmhw"
Mar 08 03:21:59.579972 master-0 kubenswrapper[13046]: I0308 03:21:59.579935 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 08 03:21:59.606353 master-0 kubenswrapper[13046]: I0308 03:21:59.601223 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-2-master-0"]
Mar 08 03:21:59.615446 master-0 kubenswrapper[13046]: I0308 03:21:59.615398 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kubelet-dir\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.615732 master-0 kubenswrapper[13046]: I0308 03:21:59.615713 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-var-lock\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.615873 master-0 kubenswrapper[13046]: I0308 03:21:59.615847 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kube-api-access\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.616027 master-0 kubenswrapper[13046]: I0308 03:21:59.616009 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kube-api-access\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.616206 master-0 kubenswrapper[13046]: I0308 03:21:59.616182 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kubelet-dir\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.616329 master-0 kubenswrapper[13046]: I0308 03:21:59.616312 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-var-lock\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.718156 master-0 kubenswrapper[13046]: I0308 03:21:59.718117 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kube-api-access\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.718591 master-0 kubenswrapper[13046]: I0308 03:21:59.718558 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kube-api-access\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719428 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kubelet-dir\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719538 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kubelet-dir\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719538 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-var-lock\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719610 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kubelet-dir\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719615 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-var-lock\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719647 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-var-lock\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719724 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-var-lock\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.719834 master-0 kubenswrapper[13046]: I0308 03:21:59.719809 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kubelet-dir\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.736219 master-0 kubenswrapper[13046]: I0308 03:21:59.736140 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kube-api-access\") pod \"installer-2-retry-2-master-0\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.744910 master-0 kubenswrapper[13046]: I0308 03:21:59.744832 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kube-api-access\") pod \"installer-4-retry-2-master-0\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:21:59.909417 master-0 kubenswrapper[13046]: I0308 03:21:59.909341 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0"
Mar 08 03:21:59.935588 master-0 kubenswrapper[13046]: I0308 03:21:59.935549 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-2-master-0"
Mar 08 03:22:00.376919 master-0 kubenswrapper[13046]: I0308 03:22:00.376852 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-2-master-0"]
Mar 08 03:22:00.382599 master-0 kubenswrapper[13046]: W0308 03:22:00.382432 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8a4e339f_ab04_4900_bf34_e683a2ed0eff.slice/crio-1919a99ec253fdf73d4a313c3a7606bcbd70f1ae352a75b72050478c5b70ceed WatchSource:0}: Error finding container 1919a99ec253fdf73d4a313c3a7606bcbd70f1ae352a75b72050478c5b70ceed: Status 404 returned error can't find the container with id 1919a99ec253fdf73d4a313c3a7606bcbd70f1ae352a75b72050478c5b70ceed
Mar 08 03:22:00.488294 master-0 kubenswrapper[13046]: I0308 03:22:00.488239 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-2-master-0"]
Mar 08 03:22:00.600958 master-0 kubenswrapper[13046]: I0308 03:22:00.600919 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" event={"ID":"8a4e339f-ab04-4900-bf34-e683a2ed0eff","Type":"ContainerStarted","Data":"1919a99ec253fdf73d4a313c3a7606bcbd70f1ae352a75b72050478c5b70ceed"}
Mar 08 03:22:00.602894 master-0 kubenswrapper[13046]: I0308 03:22:00.602544 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" event={"ID":"b9eb9745-3670-4b36-86fa-9da2aad5a9d4","Type":"ContainerStarted","Data":"286fa2c5bbdb352c300271a312061ff951294c02b169409dc01a75b23eeb68e1"}
Mar 08 03:22:01.615180 master-0 kubenswrapper[13046]: I0308 03:22:01.615092 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" event={"ID":"8a4e339f-ab04-4900-bf34-e683a2ed0eff","Type":"ContainerStarted","Data":"b782441078f2f3eb173e5b4652d1cc7729790c4a9e322427c29ee670681b45e2"}
Mar 08 03:22:01.617911 master-0 kubenswrapper[13046]: I0308 03:22:01.617858 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" event={"ID":"b9eb9745-3670-4b36-86fa-9da2aad5a9d4","Type":"ContainerStarted","Data":"c2ba556f3be879782ef9b91d6f546eeb856368d1d454f2b1ac52d44228e79b29"}
Mar 08 03:22:01.688088 master-0 kubenswrapper[13046]: I0308 03:22:01.688004 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" podStartSLOduration=2.687980516 podStartE2EDuration="2.687980516s" podCreationTimestamp="2026-03-08 03:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:22:01.650804532 +0000 UTC m=+523.729571779" watchObservedRunningTime="2026-03-08 03:22:01.687980516 +0000 UTC m=+523.766747763"
Mar 08 03:22:01.689303 master-0 kubenswrapper[13046]: I0308 03:22:01.689258 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" podStartSLOduration=2.689245771 podStartE2EDuration="2.689245771s" podCreationTimestamp="2026-03-08 03:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:22:01.68666871 +0000 UTC m=+523.765435967" watchObservedRunningTime="2026-03-08 03:22:01.689245771 +0000 UTC m=+523.768013028"
Mar 08 03:22:04.118500 master-0 kubenswrapper[13046]: I0308 03:22:04.118411 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7"
Mar 08 03:22:04.119093 master-0 kubenswrapper[13046]: E0308 03:22:04.118851 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:22:10.118580 master-0 kubenswrapper[13046]: I0308 03:22:10.118471 13046 scope.go:117] "RemoveContainer" containerID="860b9f9ba562c928fcdd286d234e1fea2b89e768618478368682f81163d4acf5"
Mar 08 03:22:10.686420 master-0 kubenswrapper[13046]: I0308 03:22:10.686315 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf" event={"ID":"eedc7538-9cc6-4bf5-9628-e278310d796b","Type":"ContainerStarted","Data":"231bbfe1b6a1bdc13d1af0dd49921bb61ad8a12fa6b548397aad49ec5e20d6dc"}
Mar 08 03:22:10.688549 master-0 kubenswrapper[13046]: I0308 03:22:10.688474 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:22:10.691735 master-0 kubenswrapper[13046]: I0308 03:22:10.691682 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-7hsbf"
Mar 08 03:22:18.711866 master-0 kubenswrapper[13046]: E0308 03:22:18.711777 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff: no such file or directory, extraDiskErr:
Mar 08 03:22:19.118391 master-0 kubenswrapper[13046]: I0308 03:22:19.118314 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7"
Mar 08 03:22:19.118766 master-0 kubenswrapper[13046]: E0308 03:22:19.118714 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045"
Mar 08 03:22:21.457519 master-0 kubenswrapper[13046]: I0308 03:22:21.457297 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:22:21.458224 master-0 kubenswrapper[13046]: I0308 03:22:21.458160 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.458537 master-0 kubenswrapper[13046]: I0308 03:22:21.458434 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 08 03:22:21.459066 master-0 kubenswrapper[13046]: I0308 03:22:21.458986 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" containerID="cri-o://9b42a4f3cd06b596c0776aff41c17ae083724aac3b4bd87b457ee3501b6408f8" gracePeriod=15
Mar 08 03:22:21.459066 master-0 kubenswrapper[13046]: I0308 03:22:21.459018 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8267a2d84f723ff694dc31049976c14f450972f451107ec8a3714b4067dbd5aa" gracePeriod=15
Mar 08 03:22:21.459315 master-0 kubenswrapper[13046]: I0308 03:22:21.459088 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ceb0c94c04f56c2553f651c28f375c02ae1b955b20e010419230a4a5aff01519" gracePeriod=15
Mar 08 03:22:21.459315 master-0 kubenswrapper[13046]: I0308 03:22:21.459087 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" containerID="cri-o://352601263b2ea037568f79eb419fdd95756531630d85b95824eedb3557887aab" gracePeriod=15
Mar 08 03:22:21.459540 master-0 kubenswrapper[13046]: I0308 03:22:21.459359 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://72517ac9670d34df16c03c6560b187788f7f0baf22e95a4ce45b7d58900f22fc" gracePeriod=15
Mar 08 03:22:21.462350 master-0 kubenswrapper[13046]: I0308 03:22:21.462182 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 08 03:22:21.467808 master-0 kubenswrapper[13046]: E0308 03:22:21.467762 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 08 03:22:21.468059 master-0 kubenswrapper[13046]: I0308 03:22:21.468026 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 08 03:22:21.468277 master-0 kubenswrapper[13046]: E0308 03:22:21.468244 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 08 03:22:21.468516 master-0 kubenswrapper[13046]: I0308 03:22:21.468452 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 08 03:22:21.468745 master-0 kubenswrapper[13046]: E0308 03:22:21.468710 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 08 03:22:21.468937 master-0 kubenswrapper[13046]: I0308 03:22:21.468906 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 08 03:22:21.472400 master-0 kubenswrapper[13046]: E0308 03:22:21.472359 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 08 03:22:21.472674 master-0 kubenswrapper[13046]: I0308 03:22:21.472639 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 08 03:22:21.473781 master-0 kubenswrapper[13046]: E0308 03:22:21.473746 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 08 03:22:21.474069 master-0 kubenswrapper[13046]: I0308 03:22:21.474050 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 08 03:22:21.474190 master-0 kubenswrapper[13046]: E0308 03:22:21.474171 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 08 03:22:21.474298 master-0 kubenswrapper[13046]: I0308 03:22:21.474280 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 08 03:22:21.475312 master-0 kubenswrapper[13046]: I0308 03:22:21.475286 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 08 03:22:21.475441 master-0 kubenswrapper[13046]: I0308 03:22:21.475420 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 08 03:22:21.475586 master-0 kubenswrapper[13046]: I0308 03:22:21.475565 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 08 03:22:21.475721 master-0 kubenswrapper[13046]: I0308 03:22:21.475702 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 08 03:22:21.475856 master-0 kubenswrapper[13046]: I0308 03:22:21.475839 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 08 03:22:21.520352 master-0 kubenswrapper[13046]: I0308 03:22:21.520298 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:22:21.520352 master-0 kubenswrapper[13046]: I0308 03:22:21.520340 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.521320 master-0 kubenswrapper[13046]: I0308 03:22:21.520372 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.521320 master-0 kubenswrapper[13046]: I0308 03:22:21.520584 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.521320 master-0 kubenswrapper[13046]: I0308 03:22:21.520664 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:22:21.521320 master-0 kubenswrapper[13046]: I0308 03:22:21.520765 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.521320 master-0 kubenswrapper[13046]: I0308 03:22:21.520792 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:22:21.521320 master-0 kubenswrapper[13046]: I0308 03:22:21.520850 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.583862 master-0 kubenswrapper[13046]: E0308 03:22:21.583808 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.622372 master-0 kubenswrapper[13046]: I0308 03:22:21.622112 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.622372 master-0 kubenswrapper[13046]: I0308 03:22:21.622189 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:22:21.622372 master-0 kubenswrapper[13046]: I0308 03:22:21.622259 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.622372 master-0 kubenswrapper[13046]: I0308 03:22:21.622261 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.622372 master-0 kubenswrapper[13046]: I0308 03:22:21.622292 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:21.622372 master-0 kubenswrapper[13046]: I0308 03:22:21.622330 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:21.622372 master-0 kubenswrapper[13046]: I0308 03:22:21.622355 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.622851 master-0 kubenswrapper[13046]: I0308 03:22:21.622391 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:21.622851 master-0 kubenswrapper[13046]: I0308 03:22:21.622579 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.622851 master-0 kubenswrapper[13046]: I0308 03:22:21.622660 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.622851 master-0 kubenswrapper[13046]: I0308 03:22:21.622797 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:21.622851 master-0 kubenswrapper[13046]: I0308 03:22:21.622822 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.623045 master-0 kubenswrapper[13046]: I0308 03:22:21.622927 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.623045 master-0 kubenswrapper[13046]: I0308 03:22:21.622963 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:21.623045 master-0 kubenswrapper[13046]: I0308 03:22:21.622993 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.623045 master-0 kubenswrapper[13046]: I0308 03:22:21.623023 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:21.774839 master-0 kubenswrapper[13046]: I0308 03:22:21.774660 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:22:21.776169 master-0 kubenswrapper[13046]: I0308 03:22:21.776115 13046 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" 
containerID="ceb0c94c04f56c2553f651c28f375c02ae1b955b20e010419230a4a5aff01519" exitCode=0 Mar 08 03:22:21.776169 master-0 kubenswrapper[13046]: I0308 03:22:21.776158 13046 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="8267a2d84f723ff694dc31049976c14f450972f451107ec8a3714b4067dbd5aa" exitCode=0 Mar 08 03:22:21.776403 master-0 kubenswrapper[13046]: I0308 03:22:21.776173 13046 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="72517ac9670d34df16c03c6560b187788f7f0baf22e95a4ce45b7d58900f22fc" exitCode=0 Mar 08 03:22:21.776403 master-0 kubenswrapper[13046]: I0308 03:22:21.776190 13046 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="352601263b2ea037568f79eb419fdd95756531630d85b95824eedb3557887aab" exitCode=2 Mar 08 03:22:21.778424 master-0 kubenswrapper[13046]: I0308 03:22:21.778359 13046 generic.go:334] "Generic (PLEG): container finished" podID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" containerID="d621d2f82abc5387792939e7e66660c43e46a86bde107e627d0911e7d0c6b86c" exitCode=0 Mar 08 03:22:21.778424 master-0 kubenswrapper[13046]: I0308 03:22:21.778411 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d3035f9a-535f-4d1a-b3a0-02e2511894ff","Type":"ContainerDied","Data":"d621d2f82abc5387792939e7e66660c43e46a86bde107e627d0911e7d0c6b86c"} Mar 08 03:22:21.780651 master-0 kubenswrapper[13046]: I0308 03:22:21.780307 13046 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:21.781550 master-0 kubenswrapper[13046]: I0308 03:22:21.781434 13046 
status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:21.885473 master-0 kubenswrapper[13046]: I0308 03:22:21.885376 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:21.923656 master-0 kubenswrapper[13046]: W0308 03:22:21.923579 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899242a15b2bdf3b4a04fb323647ca94.slice/crio-dc6fbda849df780a3dc5937b963416f7435e04add8a4699a60dd1d22f8f6bf35 WatchSource:0}: Error finding container dc6fbda849df780a3dc5937b963416f7435e04add8a4699a60dd1d22f8f6bf35: Status 404 returned error can't find the container with id dc6fbda849df780a3dc5937b963416f7435e04add8a4699a60dd1d22f8f6bf35 Mar 08 03:22:21.928448 master-0 kubenswrapper[13046]: E0308 03:22:21.928281 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189abfaa679559b3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:899242a15b2bdf3b4a04fb323647ca94,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on 
machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:22:21.926898099 +0000 UTC m=+544.005665346,LastTimestamp:2026-03-08 03:22:21.926898099 +0000 UTC m=+544.005665346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:22:22.139469 master-0 kubenswrapper[13046]: E0308 03:22:22.139369 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:22.140407 master-0 kubenswrapper[13046]: E0308 03:22:22.140328 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:22.141319 master-0 kubenswrapper[13046]: E0308 03:22:22.141236 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:22.142775 master-0 kubenswrapper[13046]: E0308 03:22:22.142709 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:22.143649 master-0 kubenswrapper[13046]: E0308 03:22:22.143584 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:22.143649 master-0 
kubenswrapper[13046]: I0308 03:22:22.143636 13046 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 03:22:22.144624 master-0 kubenswrapper[13046]: E0308 03:22:22.144543 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 08 03:22:22.346608 master-0 kubenswrapper[13046]: E0308 03:22:22.346515 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 08 03:22:22.748133 master-0 kubenswrapper[13046]: E0308 03:22:22.748064 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 08 03:22:22.786675 master-0 kubenswrapper[13046]: I0308 03:22:22.786611 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0"} Mar 08 03:22:22.786871 master-0 kubenswrapper[13046]: I0308 03:22:22.786688 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"dc6fbda849df780a3dc5937b963416f7435e04add8a4699a60dd1d22f8f6bf35"} Mar 08 03:22:22.788005 master-0 
kubenswrapper[13046]: I0308 03:22:22.787850 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:22.788005 master-0 kubenswrapper[13046]: E0308 03:22:22.787855 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:22:23.210157 master-0 kubenswrapper[13046]: I0308 03:22:23.210077 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:22:23.211166 master-0 kubenswrapper[13046]: I0308 03:22:23.211081 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:23.244692 master-0 kubenswrapper[13046]: I0308 03:22:23.244602 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-var-lock\") pod \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " Mar 08 03:22:23.244890 master-0 kubenswrapper[13046]: I0308 03:22:23.244720 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-var-lock" (OuterVolumeSpecName: "var-lock") pod 
"d3035f9a-535f-4d1a-b3a0-02e2511894ff" (UID: "d3035f9a-535f-4d1a-b3a0-02e2511894ff"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:23.244890 master-0 kubenswrapper[13046]: I0308 03:22:23.244765 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kube-api-access\") pod \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " Mar 08 03:22:23.244890 master-0 kubenswrapper[13046]: I0308 03:22:23.244861 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kubelet-dir\") pod \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\" (UID: \"d3035f9a-535f-4d1a-b3a0-02e2511894ff\") " Mar 08 03:22:23.245187 master-0 kubenswrapper[13046]: I0308 03:22:23.245095 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d3035f9a-535f-4d1a-b3a0-02e2511894ff" (UID: "d3035f9a-535f-4d1a-b3a0-02e2511894ff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:23.245288 master-0 kubenswrapper[13046]: I0308 03:22:23.245202 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:23.249389 master-0 kubenswrapper[13046]: I0308 03:22:23.249324 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d3035f9a-535f-4d1a-b3a0-02e2511894ff" (UID: "d3035f9a-535f-4d1a-b3a0-02e2511894ff"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:22:23.346657 master-0 kubenswrapper[13046]: I0308 03:22:23.346581 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:23.346657 master-0 kubenswrapper[13046]: I0308 03:22:23.346639 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3035f9a-535f-4d1a-b3a0-02e2511894ff-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:23.569334 master-0 kubenswrapper[13046]: E0308 03:22:23.569080 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 08 03:22:23.796656 master-0 kubenswrapper[13046]: I0308 03:22:23.796584 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d3035f9a-535f-4d1a-b3a0-02e2511894ff","Type":"ContainerDied","Data":"843011d22a9efa2481f47e7bc77fd2a5c2a55de0a67168e32106b70a1ab1c8ad"} Mar 08 03:22:23.796656 master-0 kubenswrapper[13046]: I0308 03:22:23.796622 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 08 03:22:23.796656 master-0 kubenswrapper[13046]: I0308 03:22:23.796655 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="843011d22a9efa2481f47e7bc77fd2a5c2a55de0a67168e32106b70a1ab1c8ad" Mar 08 03:22:23.802735 master-0 kubenswrapper[13046]: I0308 03:22:23.802673 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:22:23.804104 master-0 kubenswrapper[13046]: I0308 03:22:23.804035 13046 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="9b42a4f3cd06b596c0776aff41c17ae083724aac3b4bd87b457ee3501b6408f8" exitCode=0 Mar 08 03:22:23.822962 master-0 kubenswrapper[13046]: I0308 03:22:23.822855 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:22:23.824060 master-0 kubenswrapper[13046]: I0308 03:22:23.824014 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:23.825162 master-0 kubenswrapper[13046]: I0308 03:22:23.825104 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:23.825866 master-0 kubenswrapper[13046]: I0308 03:22:23.825808 13046 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:23.826824 master-0 kubenswrapper[13046]: I0308 03:22:23.826771 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:23.827305 master-0 kubenswrapper[13046]: I0308 03:22:23.827255 13046 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:23.858785 master-0 kubenswrapper[13046]: I0308 03:22:23.858715 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") 
pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 03:22:23.858920 master-0 kubenswrapper[13046]: I0308 03:22:23.858848 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 03:22:23.858991 master-0 kubenswrapper[13046]: I0308 03:22:23.858958 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 08 03:22:23.859082 master-0 kubenswrapper[13046]: I0308 03:22:23.858832 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:23.859082 master-0 kubenswrapper[13046]: I0308 03:22:23.859015 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:23.859082 master-0 kubenswrapper[13046]: I0308 03:22:23.858879 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:23.859277 master-0 kubenswrapper[13046]: I0308 03:22:23.859245 13046 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:23.859277 master-0 kubenswrapper[13046]: I0308 03:22:23.859265 13046 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:23.859277 master-0 kubenswrapper[13046]: I0308 03:22:23.859279 13046 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:24.129176 master-0 kubenswrapper[13046]: I0308 03:22:24.129107 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" path="/var/lib/kubelet/pods/cdcecc61ff5eeb08bd2a3ac12599e4f9/volumes" Mar 08 03:22:24.817286 master-0 kubenswrapper[13046]: I0308 03:22:24.817208 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 08 03:22:24.818541 master-0 kubenswrapper[13046]: I0308 03:22:24.818433 13046 scope.go:117] "RemoveContainer" containerID="ceb0c94c04f56c2553f651c28f375c02ae1b955b20e010419230a4a5aff01519" Mar 08 03:22:24.818756 master-0 kubenswrapper[13046]: I0308 03:22:24.818608 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:24.819807 master-0 kubenswrapper[13046]: I0308 03:22:24.819725 13046 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:24.820649 master-0 kubenswrapper[13046]: I0308 03:22:24.820574 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:24.823665 master-0 kubenswrapper[13046]: I0308 03:22:24.823599 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:24.824919 master-0 kubenswrapper[13046]: I0308 03:22:24.824828 13046 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:24.846972 master-0 kubenswrapper[13046]: I0308 03:22:24.846906 13046 scope.go:117] "RemoveContainer" containerID="8267a2d84f723ff694dc31049976c14f450972f451107ec8a3714b4067dbd5aa" Mar 08 03:22:24.865339 master-0 kubenswrapper[13046]: I0308 
03:22:24.865295 13046 scope.go:117] "RemoveContainer" containerID="72517ac9670d34df16c03c6560b187788f7f0baf22e95a4ce45b7d58900f22fc" Mar 08 03:22:24.888362 master-0 kubenswrapper[13046]: I0308 03:22:24.888296 13046 scope.go:117] "RemoveContainer" containerID="352601263b2ea037568f79eb419fdd95756531630d85b95824eedb3557887aab" Mar 08 03:22:24.916839 master-0 kubenswrapper[13046]: I0308 03:22:24.916756 13046 scope.go:117] "RemoveContainer" containerID="9b42a4f3cd06b596c0776aff41c17ae083724aac3b4bd87b457ee3501b6408f8" Mar 08 03:22:24.949636 master-0 kubenswrapper[13046]: I0308 03:22:24.949533 13046 scope.go:117] "RemoveContainer" containerID="5b012a30bc2b3cc713592db2b85b5cc01f37c9b84a3768f1b8abdd21b2236990" Mar 08 03:22:25.170410 master-0 kubenswrapper[13046]: E0308 03:22:25.170333 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 08 03:22:26.098233 master-0 kubenswrapper[13046]: E0308 03:22:26.097997 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189abfaa679559b3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:899242a15b2bdf3b4a04fb323647ca94,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on 
machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:22:21.926898099 +0000 UTC m=+544.005665346,LastTimestamp:2026-03-08 03:22:21.926898099 +0000 UTC m=+544.005665346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:22:28.123974 master-0 kubenswrapper[13046]: I0308 03:22:28.123855 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:28.372888 master-0 kubenswrapper[13046]: E0308 03:22:28.372796 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 08 03:22:31.876828 master-0 kubenswrapper[13046]: I0308 03:22:31.876750 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-retry-2-master-0_8a4e339f-ab04-4900-bf34-e683a2ed0eff/installer/0.log" Mar 08 03:22:31.877678 master-0 kubenswrapper[13046]: I0308 03:22:31.876832 13046 generic.go:334] "Generic (PLEG): container finished" podID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" containerID="b782441078f2f3eb173e5b4652d1cc7729790c4a9e322427c29ee670681b45e2" exitCode=1 Mar 08 03:22:31.877678 master-0 kubenswrapper[13046]: I0308 03:22:31.876873 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" event={"ID":"8a4e339f-ab04-4900-bf34-e683a2ed0eff","Type":"ContainerDied","Data":"b782441078f2f3eb173e5b4652d1cc7729790c4a9e322427c29ee670681b45e2"} 
Mar 08 03:22:31.878321 master-0 kubenswrapper[13046]: I0308 03:22:31.878201 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:31.880357 master-0 kubenswrapper[13046]: I0308 03:22:31.880287 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:32.118834 master-0 kubenswrapper[13046]: I0308 03:22:32.118763 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:22:32.119191 master-0 kubenswrapper[13046]: E0308 03:22:32.119131 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:22:32.888478 master-0 kubenswrapper[13046]: I0308 03:22:32.888372 13046 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="948a7b297afa3e564a934c704f69529deb5cbe59409e42dcc466920922ad1702" exitCode=1 Mar 08 03:22:32.889188 master-0 kubenswrapper[13046]: I0308 03:22:32.888468 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" 
event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"948a7b297afa3e564a934c704f69529deb5cbe59409e42dcc466920922ad1702"} Mar 08 03:22:32.889188 master-0 kubenswrapper[13046]: I0308 03:22:32.888591 13046 scope.go:117] "RemoveContainer" containerID="2548f09197f50d8cb959dd110f3a56a4fb32fbf469012609ca083cfd96f66597" Mar 08 03:22:32.889435 master-0 kubenswrapper[13046]: I0308 03:22:32.889368 13046 scope.go:117] "RemoveContainer" containerID="948a7b297afa3e564a934c704f69529deb5cbe59409e42dcc466920922ad1702" Mar 08 03:22:32.889839 master-0 kubenswrapper[13046]: E0308 03:22:32.889790 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(a1a56802af72ce1aac6b5077f1695ac0)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" Mar 08 03:22:32.890367 master-0 kubenswrapper[13046]: I0308 03:22:32.890187 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:32.891839 master-0 kubenswrapper[13046]: I0308 03:22:32.891261 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:32.893619 master-0 kubenswrapper[13046]: I0308 03:22:32.892611 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" 
pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:32.893619 master-0 kubenswrapper[13046]: I0308 03:22:32.892965 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-2-master-0_b9eb9745-3670-4b36-86fa-9da2aad5a9d4/installer/0.log" Mar 08 03:22:32.893619 master-0 kubenswrapper[13046]: I0308 03:22:32.893156 13046 generic.go:334] "Generic (PLEG): container finished" podID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" containerID="c2ba556f3be879782ef9b91d6f546eeb856368d1d454f2b1ac52d44228e79b29" exitCode=1 Mar 08 03:22:32.893619 master-0 kubenswrapper[13046]: I0308 03:22:32.893220 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" event={"ID":"b9eb9745-3670-4b36-86fa-9da2aad5a9d4","Type":"ContainerDied","Data":"c2ba556f3be879782ef9b91d6f546eeb856368d1d454f2b1ac52d44228e79b29"} Mar 08 03:22:32.894823 master-0 kubenswrapper[13046]: I0308 03:22:32.894763 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:32.895874 master-0 kubenswrapper[13046]: I0308 03:22:32.895784 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:32.896788 master-0 
kubenswrapper[13046]: I0308 03:22:32.896702 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:32.897719 master-0 kubenswrapper[13046]: I0308 03:22:32.897641 13046 status_manager.go:851] "Failed to get status for pod" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.290718 master-0 kubenswrapper[13046]: I0308 03:22:33.290660 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-retry-2-master-0_8a4e339f-ab04-4900-bf34-e683a2ed0eff/installer/0.log" Mar 08 03:22:33.290907 master-0 kubenswrapper[13046]: I0308 03:22:33.290766 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" Mar 08 03:22:33.292035 master-0 kubenswrapper[13046]: I0308 03:22:33.291964 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.293024 master-0 kubenswrapper[13046]: I0308 03:22:33.292956 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.293812 master-0 kubenswrapper[13046]: I0308 03:22:33.293745 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.294852 master-0 kubenswrapper[13046]: I0308 03:22:33.294785 13046 status_manager.go:851] "Failed to get status for pod" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.389636 master-0 kubenswrapper[13046]: I0308 03:22:33.389585 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kubelet-dir\") pod \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " Mar 08 03:22:33.390012 master-0 kubenswrapper[13046]: I0308 03:22:33.389657 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-var-lock\") pod \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " Mar 08 03:22:33.390012 master-0 kubenswrapper[13046]: I0308 03:22:33.389683 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8a4e339f-ab04-4900-bf34-e683a2ed0eff" (UID: "8a4e339f-ab04-4900-bf34-e683a2ed0eff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:33.390012 master-0 kubenswrapper[13046]: I0308 03:22:33.389750 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kube-api-access\") pod \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\" (UID: \"8a4e339f-ab04-4900-bf34-e683a2ed0eff\") " Mar 08 03:22:33.390351 master-0 kubenswrapper[13046]: I0308 03:22:33.390281 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-var-lock" (OuterVolumeSpecName: "var-lock") pod "8a4e339f-ab04-4900-bf34-e683a2ed0eff" (UID: "8a4e339f-ab04-4900-bf34-e683a2ed0eff"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:33.390946 master-0 kubenswrapper[13046]: I0308 03:22:33.390906 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:33.391010 master-0 kubenswrapper[13046]: I0308 03:22:33.390989 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a4e339f-ab04-4900-bf34-e683a2ed0eff-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:33.394972 master-0 kubenswrapper[13046]: I0308 03:22:33.394914 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8a4e339f-ab04-4900-bf34-e683a2ed0eff" (UID: "8a4e339f-ab04-4900-bf34-e683a2ed0eff"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:22:33.492363 master-0 kubenswrapper[13046]: I0308 03:22:33.492248 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a4e339f-ab04-4900-bf34-e683a2ed0eff-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:33.906183 master-0 kubenswrapper[13046]: I0308 03:22:33.906069 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-retry-2-master-0_8a4e339f-ab04-4900-bf34-e683a2ed0eff/installer/0.log" Mar 08 03:22:33.907113 master-0 kubenswrapper[13046]: I0308 03:22:33.906393 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" Mar 08 03:22:33.907113 master-0 kubenswrapper[13046]: I0308 03:22:33.906471 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" event={"ID":"8a4e339f-ab04-4900-bf34-e683a2ed0eff","Type":"ContainerDied","Data":"1919a99ec253fdf73d4a313c3a7606bcbd70f1ae352a75b72050478c5b70ceed"} Mar 08 03:22:33.907113 master-0 kubenswrapper[13046]: I0308 03:22:33.906581 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1919a99ec253fdf73d4a313c3a7606bcbd70f1ae352a75b72050478c5b70ceed" Mar 08 03:22:33.929401 master-0 kubenswrapper[13046]: I0308 03:22:33.929023 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.930389 master-0 kubenswrapper[13046]: I0308 03:22:33.929954 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.930882 master-0 kubenswrapper[13046]: I0308 03:22:33.930806 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:33.931822 master-0 
kubenswrapper[13046]: I0308 03:22:33.931757 13046 status_manager.go:851] "Failed to get status for pod" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.240062 master-0 kubenswrapper[13046]: I0308 03:22:34.239277 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-2-master-0_b9eb9745-3670-4b36-86fa-9da2aad5a9d4/installer/0.log" Mar 08 03:22:34.240062 master-0 kubenswrapper[13046]: I0308 03:22:34.239380 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" Mar 08 03:22:34.240583 master-0 kubenswrapper[13046]: I0308 03:22:34.240453 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.241149 master-0 kubenswrapper[13046]: I0308 03:22:34.241090 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.241956 master-0 kubenswrapper[13046]: I0308 03:22:34.241871 13046 status_manager.go:851] "Failed to get status for pod" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.242859 master-0 kubenswrapper[13046]: I0308 03:22:34.242781 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.304742 master-0 kubenswrapper[13046]: I0308 03:22:34.304684 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kubelet-dir\") pod \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " Mar 08 03:22:34.305099 master-0 kubenswrapper[13046]: I0308 03:22:34.305061 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kube-api-access\") pod \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " Mar 08 03:22:34.305449 master-0 kubenswrapper[13046]: I0308 03:22:34.305419 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-var-lock\") pod \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\" (UID: \"b9eb9745-3670-4b36-86fa-9da2aad5a9d4\") " Mar 08 03:22:34.305817 master-0 kubenswrapper[13046]: I0308 03:22:34.304867 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b9eb9745-3670-4b36-86fa-9da2aad5a9d4" (UID: 
"b9eb9745-3670-4b36-86fa-9da2aad5a9d4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:34.305817 master-0 kubenswrapper[13046]: I0308 03:22:34.305538 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-var-lock" (OuterVolumeSpecName: "var-lock") pod "b9eb9745-3670-4b36-86fa-9da2aad5a9d4" (UID: "b9eb9745-3670-4b36-86fa-9da2aad5a9d4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:22:34.306340 master-0 kubenswrapper[13046]: I0308 03:22:34.306300 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:34.306542 master-0 kubenswrapper[13046]: I0308 03:22:34.306519 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:34.309699 master-0 kubenswrapper[13046]: I0308 03:22:34.309649 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b9eb9745-3670-4b36-86fa-9da2aad5a9d4" (UID: "b9eb9745-3670-4b36-86fa-9da2aad5a9d4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:22:34.408703 master-0 kubenswrapper[13046]: I0308 03:22:34.408609 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9eb9745-3670-4b36-86fa-9da2aad5a9d4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:22:34.773898 master-0 kubenswrapper[13046]: E0308 03:22:34.773842 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Mar 08 03:22:34.918191 master-0 kubenswrapper[13046]: I0308 03:22:34.917999 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-2-master-0_b9eb9745-3670-4b36-86fa-9da2aad5a9d4/installer/0.log" Mar 08 03:22:34.918191 master-0 kubenswrapper[13046]: I0308 03:22:34.918094 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" event={"ID":"b9eb9745-3670-4b36-86fa-9da2aad5a9d4","Type":"ContainerDied","Data":"286fa2c5bbdb352c300271a312061ff951294c02b169409dc01a75b23eeb68e1"} Mar 08 03:22:34.918191 master-0 kubenswrapper[13046]: I0308 03:22:34.918135 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="286fa2c5bbdb352c300271a312061ff951294c02b169409dc01a75b23eeb68e1" Mar 08 03:22:34.919090 master-0 kubenswrapper[13046]: I0308 03:22:34.918247 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" Mar 08 03:22:34.948138 master-0 kubenswrapper[13046]: I0308 03:22:34.948041 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.948962 master-0 kubenswrapper[13046]: I0308 03:22:34.948876 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.949929 master-0 kubenswrapper[13046]: I0308 03:22:34.949859 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:34.951657 master-0 kubenswrapper[13046]: I0308 03:22:34.951572 13046 status_manager.go:851] "Failed to get status for pod" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.100419 master-0 kubenswrapper[13046]: E0308 03:22:36.100223 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189abfaa679559b3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:899242a15b2bdf3b4a04fb323647ca94,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:22:21.926898099 +0000 UTC m=+544.005665346,LastTimestamp:2026-03-08 03:22:21.926898099 +0000 UTC m=+544.005665346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 08 03:22:36.118123 master-0 kubenswrapper[13046]: I0308 03:22:36.117772 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:36.119561 master-0 kubenswrapper[13046]: I0308 03:22:36.119385 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.120732 master-0 kubenswrapper[13046]: I0308 03:22:36.120508 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.123782 master-0 kubenswrapper[13046]: I0308 03:22:36.121451 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.123782 master-0 kubenswrapper[13046]: I0308 03:22:36.122300 13046 status_manager.go:851] "Failed to get status for pod" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.145296 master-0 kubenswrapper[13046]: I0308 03:22:36.145120 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:36.145296 master-0 kubenswrapper[13046]: I0308 03:22:36.145172 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:36.146692 master-0 kubenswrapper[13046]: E0308 03:22:36.146610 13046 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:36.147277 master-0 kubenswrapper[13046]: I0308 03:22:36.147198 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:36.174035 master-0 kubenswrapper[13046]: W0308 03:22:36.173967 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077dd10388b9e3e48a07382126e86621.slice/crio-38626d5ea61917d7aed5b0b0dfa04eec9fc49f488cebe4ed87e6f63c57546c51 WatchSource:0}: Error finding container 38626d5ea61917d7aed5b0b0dfa04eec9fc49f488cebe4ed87e6f63c57546c51: Status 404 returned error can't find the container with id 38626d5ea61917d7aed5b0b0dfa04eec9fc49f488cebe4ed87e6f63c57546c51 Mar 08 03:22:36.937620 master-0 kubenswrapper[13046]: I0308 03:22:36.937530 13046 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9" exitCode=0 Mar 08 03:22:36.937620 master-0 kubenswrapper[13046]: I0308 03:22:36.937570 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerDied","Data":"ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9"} Mar 08 03:22:36.937620 master-0 kubenswrapper[13046]: I0308 03:22:36.937596 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"38626d5ea61917d7aed5b0b0dfa04eec9fc49f488cebe4ed87e6f63c57546c51"} Mar 08 03:22:36.938249 master-0 kubenswrapper[13046]: I0308 03:22:36.937845 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:36.938249 master-0 kubenswrapper[13046]: I0308 03:22:36.937858 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:36.938892 master-0 kubenswrapper[13046]: I0308 03:22:36.938807 13046 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.939004 master-0 kubenswrapper[13046]: E0308 03:22:36.938878 13046 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:36.942325 master-0 kubenswrapper[13046]: I0308 03:22:36.942228 13046 status_manager.go:851] "Failed to get status for pod" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" pod="openshift-kube-controller-manager/installer-2-retry-2-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.943654 master-0 kubenswrapper[13046]: I0308 03:22:36.943581 13046 status_manager.go:851] "Failed to get status for pod" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:36.944439 master-0 kubenswrapper[13046]: I0308 03:22:36.944388 13046 status_manager.go:851] "Failed to get status for pod" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" pod="openshift-kube-scheduler/installer-4-retry-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/installer-4-retry-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:22:37.962107 master-0 kubenswrapper[13046]: I0308 03:22:37.962051 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b"} Mar 08 03:22:37.962107 master-0 kubenswrapper[13046]: I0308 03:22:37.962094 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4"} Mar 08 03:22:37.962107 master-0 kubenswrapper[13046]: I0308 03:22:37.962108 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59"} Mar 08 03:22:38.978060 master-0 kubenswrapper[13046]: I0308 03:22:38.978002 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62"} Mar 08 03:22:38.978060 master-0 kubenswrapper[13046]: I0308 03:22:38.978062 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b"} Mar 08 03:22:38.978629 master-0 kubenswrapper[13046]: I0308 03:22:38.978209 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:38.978629 master-0 kubenswrapper[13046]: I0308 03:22:38.978329 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:38.978629 master-0 kubenswrapper[13046]: I0308 03:22:38.978359 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:41.148517 master-0 kubenswrapper[13046]: I0308 03:22:41.148397 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:41.148517 master-0 kubenswrapper[13046]: I0308 03:22:41.148475 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:41.157281 master-0 kubenswrapper[13046]: I0308 03:22:41.157218 13046 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:43.118746 master-0 kubenswrapper[13046]: I0308 03:22:43.118669 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:22:43.119708 master-0 kubenswrapper[13046]: E0308 03:22:43.119153 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:22:44.028019 master-0 kubenswrapper[13046]: I0308 03:22:44.027973 13046 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:44.117138 master-0 kubenswrapper[13046]: I0308 03:22:44.116982 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="6813ddce-840d-4a92-991c-6fc6204282fc" Mar 08 03:22:44.118826 master-0 kubenswrapper[13046]: I0308 03:22:44.118782 13046 scope.go:117] "RemoveContainer" containerID="948a7b297afa3e564a934c704f69529deb5cbe59409e42dcc466920922ad1702" Mar 08 03:22:44.119176 master-0 kubenswrapper[13046]: E0308 03:22:44.119003 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(a1a56802af72ce1aac6b5077f1695ac0)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" Mar 08 03:22:45.046081 master-0 kubenswrapper[13046]: I0308 
03:22:45.045979 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:45.046081 master-0 kubenswrapper[13046]: I0308 03:22:45.046035 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:45.052475 master-0 kubenswrapper[13046]: I0308 03:22:45.052396 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="6813ddce-840d-4a92-991c-6fc6204282fc" Mar 08 03:22:45.052903 master-0 kubenswrapper[13046]: I0308 03:22:45.052867 13046 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59" Mar 08 03:22:45.053049 master-0 kubenswrapper[13046]: I0308 03:22:45.053027 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:22:46.056178 master-0 kubenswrapper[13046]: I0308 03:22:46.056084 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:46.056178 master-0 kubenswrapper[13046]: I0308 03:22:46.056142 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:22:46.059772 master-0 kubenswrapper[13046]: I0308 03:22:46.059699 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="6813ddce-840d-4a92-991c-6fc6204282fc" Mar 08 03:22:53.356299 
master-0 kubenswrapper[13046]: I0308 03:22:53.356220 13046 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 03:22:53.712819 master-0 kubenswrapper[13046]: I0308 03:22:53.712567 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 03:22:54.638668 master-0 kubenswrapper[13046]: I0308 03:22:54.638582 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 03:22:54.681987 master-0 kubenswrapper[13046]: I0308 03:22:54.681917 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 03:22:55.118021 master-0 kubenswrapper[13046]: I0308 03:22:55.117926 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:22:55.118550 master-0 kubenswrapper[13046]: E0308 03:22:55.118270 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:22:55.122258 master-0 kubenswrapper[13046]: I0308 03:22:55.122187 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 03:22:55.499576 master-0 kubenswrapper[13046]: I0308 03:22:55.499389 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 03:22:55.508373 master-0 kubenswrapper[13046]: I0308 03:22:55.508315 13046 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Mar 08 03:22:55.510837 master-0 kubenswrapper[13046]: I0308 03:22:55.510769 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 03:22:55.636659 master-0 kubenswrapper[13046]: I0308 03:22:55.636552 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:22:56.038072 master-0 kubenswrapper[13046]: I0308 03:22:56.037975 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 03:22:56.118796 master-0 kubenswrapper[13046]: I0308 03:22:56.118689 13046 scope.go:117] "RemoveContainer" containerID="948a7b297afa3e564a934c704f69529deb5cbe59409e42dcc466920922ad1702" Mar 08 03:22:56.362633 master-0 kubenswrapper[13046]: I0308 03:22:56.360745 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 03:22:56.412793 master-0 kubenswrapper[13046]: I0308 03:22:56.412746 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 03:22:56.441421 master-0 kubenswrapper[13046]: I0308 03:22:56.441324 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 03:22:56.498454 master-0 kubenswrapper[13046]: I0308 03:22:56.498374 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 08 03:22:57.098864 master-0 kubenswrapper[13046]: I0308 03:22:57.098796 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 03:22:57.149850 master-0 kubenswrapper[13046]: I0308 03:22:57.149801 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"94498e732862075a2e7db935be27489aa50ccc49f721fd0c7a11e45e6a5920c8"} Mar 08 03:22:57.255144 master-0 kubenswrapper[13046]: I0308 03:22:57.255054 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 03:22:57.301925 master-0 kubenswrapper[13046]: I0308 03:22:57.301833 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-f5lxw" Mar 08 03:22:57.306137 master-0 kubenswrapper[13046]: I0308 03:22:57.306057 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 03:22:57.352873 master-0 kubenswrapper[13046]: I0308 03:22:57.352722 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 03:22:57.368413 master-0 kubenswrapper[13046]: I0308 03:22:57.368359 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 03:22:57.447707 master-0 kubenswrapper[13046]: I0308 03:22:57.447638 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 03:22:57.667790 master-0 kubenswrapper[13046]: I0308 03:22:57.667611 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 03:22:57.702288 master-0 kubenswrapper[13046]: I0308 03:22:57.702232 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:22:57.793157 master-0 kubenswrapper[13046]: I0308 03:22:57.793067 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-rkgwg" Mar 08 03:22:57.867913 master-0 kubenswrapper[13046]: I0308 03:22:57.867832 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 03:22:57.922562 master-0 kubenswrapper[13046]: I0308 03:22:57.922374 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 03:22:58.003884 master-0 kubenswrapper[13046]: I0308 03:22:58.003806 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 03:22:58.025316 master-0 kubenswrapper[13046]: I0308 03:22:58.025272 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 03:22:58.029544 master-0 kubenswrapper[13046]: I0308 03:22:58.029457 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 03:22:58.064430 master-0 kubenswrapper[13046]: I0308 03:22:58.064347 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 03:22:58.075508 master-0 kubenswrapper[13046]: I0308 03:22:58.075426 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 03:22:58.122551 master-0 kubenswrapper[13046]: I0308 03:22:58.122466 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:22:58.177972 master-0 kubenswrapper[13046]: I0308 03:22:58.177830 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 08 03:22:58.222181 master-0 kubenswrapper[13046]: I0308 
03:22:58.222149 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 03:22:58.340996 master-0 kubenswrapper[13046]: I0308 03:22:58.340933 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 03:22:58.341415 master-0 kubenswrapper[13046]: I0308 03:22:58.341382 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 03:22:58.509216 master-0 kubenswrapper[13046]: I0308 03:22:58.509121 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 08 03:22:58.615666 master-0 kubenswrapper[13046]: I0308 03:22:58.615615 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 03:22:58.663715 master-0 kubenswrapper[13046]: I0308 03:22:58.663663 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 03:22:58.801075 master-0 kubenswrapper[13046]: I0308 03:22:58.800924 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 03:22:58.884776 master-0 kubenswrapper[13046]: I0308 03:22:58.884713 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 03:22:58.932715 master-0 kubenswrapper[13046]: I0308 03:22:58.932676 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-29dgn" Mar 08 03:22:59.069876 master-0 kubenswrapper[13046]: I0308 03:22:59.069716 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 03:22:59.218982 master-0 kubenswrapper[13046]: I0308 03:22:59.218924 13046 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-xfphx" Mar 08 03:22:59.293688 master-0 kubenswrapper[13046]: I0308 03:22:59.293602 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 03:22:59.405617 master-0 kubenswrapper[13046]: I0308 03:22:59.405313 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:22:59.516185 master-0 kubenswrapper[13046]: I0308 03:22:59.516109 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 03:22:59.592346 master-0 kubenswrapper[13046]: I0308 03:22:59.592267 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:22:59.615612 master-0 kubenswrapper[13046]: I0308 03:22:59.615542 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 03:22:59.634959 master-0 kubenswrapper[13046]: I0308 03:22:59.634903 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 08 03:22:59.637624 master-0 kubenswrapper[13046]: I0308 03:22:59.637290 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 03:22:59.669261 master-0 kubenswrapper[13046]: I0308 03:22:59.669127 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 03:22:59.686577 master-0 kubenswrapper[13046]: I0308 03:22:59.686547 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 03:22:59.802263 master-0 kubenswrapper[13046]: 
I0308 03:22:59.802157 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 03:22:59.913762 master-0 kubenswrapper[13046]: I0308 03:22:59.913689 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 03:23:00.061214 master-0 kubenswrapper[13046]: I0308 03:23:00.061062 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:23:00.131241 master-0 kubenswrapper[13046]: I0308 03:23:00.131154 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 03:23:00.140914 master-0 kubenswrapper[13046]: I0308 03:23:00.140837 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 03:23:00.237428 master-0 kubenswrapper[13046]: I0308 03:23:00.237354 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 03:23:00.248747 master-0 kubenswrapper[13046]: I0308 03:23:00.248701 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 03:23:00.287074 master-0 kubenswrapper[13046]: I0308 03:23:00.286990 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 03:23:00.302000 master-0 kubenswrapper[13046]: I0308 03:23:00.301922 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 03:23:00.321447 master-0 kubenswrapper[13046]: I0308 03:23:00.321282 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 03:23:00.362417 master-0 kubenswrapper[13046]: I0308 03:23:00.361129 13046 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 03:23:00.490541 master-0 kubenswrapper[13046]: I0308 03:23:00.490441 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 03:23:00.536760 master-0 kubenswrapper[13046]: I0308 03:23:00.536657 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 03:23:00.700132 master-0 kubenswrapper[13046]: I0308 03:23:00.700055 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-sbm8j" Mar 08 03:23:00.809767 master-0 kubenswrapper[13046]: I0308 03:23:00.809665 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 03:23:00.828374 master-0 kubenswrapper[13046]: I0308 03:23:00.828283 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 08 03:23:00.943151 master-0 kubenswrapper[13046]: I0308 03:23:00.943047 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:23:01.016130 master-0 kubenswrapper[13046]: I0308 03:23:01.015964 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-r4dpg" Mar 08 03:23:01.016786 master-0 kubenswrapper[13046]: I0308 03:23:01.016710 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 03:23:01.035845 master-0 kubenswrapper[13046]: I0308 03:23:01.035775 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 03:23:01.151834 master-0 kubenswrapper[13046]: I0308 03:23:01.151741 
13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:23:01.181762 master-0 kubenswrapper[13046]: I0308 03:23:01.181696 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 03:23:01.221052 master-0 kubenswrapper[13046]: I0308 03:23:01.220956 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 03:23:01.235019 master-0 kubenswrapper[13046]: I0308 03:23:01.234940 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:23:01.408455 master-0 kubenswrapper[13046]: I0308 03:23:01.408381 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 03:23:01.635114 master-0 kubenswrapper[13046]: I0308 03:23:01.635031 13046 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 03:23:01.685767 master-0 kubenswrapper[13046]: I0308 03:23:01.685620 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 03:23:01.686555 master-0 kubenswrapper[13046]: I0308 03:23:01.686517 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 03:23:01.752823 master-0 kubenswrapper[13046]: I0308 03:23:01.752722 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 03:23:01.772281 master-0 kubenswrapper[13046]: I0308 03:23:01.771980 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 03:23:01.975038 master-0 kubenswrapper[13046]: I0308 
03:23:01.974883 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 03:23:02.009824 master-0 kubenswrapper[13046]: I0308 03:23:02.009741 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 03:23:02.010817 master-0 kubenswrapper[13046]: I0308 03:23:02.010765 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l8646" Mar 08 03:23:02.157518 master-0 kubenswrapper[13046]: I0308 03:23:02.157379 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 08 03:23:02.199511 master-0 kubenswrapper[13046]: I0308 03:23:02.199405 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-dmv4m" Mar 08 03:23:02.233700 master-0 kubenswrapper[13046]: I0308 03:23:02.233525 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-lwkgm" Mar 08 03:23:02.235927 master-0 kubenswrapper[13046]: I0308 03:23:02.235866 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 08 03:23:02.304257 master-0 kubenswrapper[13046]: I0308 03:23:02.304133 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 03:23:02.309725 master-0 kubenswrapper[13046]: I0308 03:23:02.309646 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 03:23:02.357134 master-0 kubenswrapper[13046]: I0308 03:23:02.356999 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"default-dockercfg-r5m92" Mar 08 03:23:02.381510 master-0 kubenswrapper[13046]: I0308 03:23:02.381388 13046 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 03:23:02.384242 master-0 kubenswrapper[13046]: I0308 03:23:02.384175 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 03:23:02.485360 master-0 kubenswrapper[13046]: I0308 03:23:02.485219 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 03:23:02.544632 master-0 kubenswrapper[13046]: I0308 03:23:02.544539 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 03:23:02.588145 master-0 kubenswrapper[13046]: I0308 03:23:02.588092 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:23:02.591819 master-0 kubenswrapper[13046]: I0308 03:23:02.591768 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 03:23:02.596560 master-0 kubenswrapper[13046]: I0308 03:23:02.596469 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-5lx9s" Mar 08 03:23:02.615078 master-0 kubenswrapper[13046]: I0308 03:23:02.615028 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 03:23:02.719800 master-0 kubenswrapper[13046]: I0308 03:23:02.719717 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 08 03:23:02.804530 master-0 kubenswrapper[13046]: I0308 03:23:02.804302 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Mar 08 03:23:02.863378 master-0 kubenswrapper[13046]: I0308 03:23:02.863295 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 03:23:02.872864 master-0 kubenswrapper[13046]: I0308 03:23:02.872811 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 03:23:02.891937 master-0 kubenswrapper[13046]: I0308 03:23:02.891806 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 08 03:23:02.915451 master-0 kubenswrapper[13046]: I0308 03:23:02.915355 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 08 03:23:02.981084 master-0 kubenswrapper[13046]: I0308 03:23:02.980995 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 03:23:02.989119 master-0 kubenswrapper[13046]: I0308 03:23:02.989071 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 03:23:03.072587 master-0 kubenswrapper[13046]: I0308 03:23:03.072355 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 03:23:03.218942 master-0 kubenswrapper[13046]: I0308 03:23:03.218860 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 08 03:23:03.403796 master-0 kubenswrapper[13046]: I0308 03:23:03.403714 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:23:03.422766 master-0 kubenswrapper[13046]: I0308 03:23:03.422694 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:23:03.463235 master-0 kubenswrapper[13046]: I0308 03:23:03.463137 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 03:23:03.524014 master-0 kubenswrapper[13046]: I0308 03:23:03.523960 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 03:23:03.558331 master-0 kubenswrapper[13046]: I0308 03:23:03.558230 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 08 03:23:03.644748 master-0 kubenswrapper[13046]: I0308 03:23:03.644651 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 03:23:03.740559 master-0 kubenswrapper[13046]: I0308 03:23:03.740321 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 03:23:03.760436 master-0 kubenswrapper[13046]: I0308 03:23:03.760334 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 03:23:03.920743 master-0 kubenswrapper[13046]: I0308 03:23:03.920675 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-4dw5m" Mar 08 03:23:03.921590 master-0 kubenswrapper[13046]: I0308 03:23:03.921473 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qktgm" Mar 08 03:23:04.017841 master-0 kubenswrapper[13046]: I0308 03:23:04.017709 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:23:04.036982 master-0 kubenswrapper[13046]: I0308 03:23:04.036920 13046 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 08 03:23:04.045666 master-0 kubenswrapper[13046]: I0308 03:23:04.045598 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:23:04.047977 master-0 kubenswrapper[13046]: I0308 03:23:04.047927 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 03:23:04.192561 master-0 kubenswrapper[13046]: I0308 03:23:04.192408 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 03:23:04.267501 master-0 kubenswrapper[13046]: I0308 03:23:04.267420 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 03:23:04.321584 master-0 kubenswrapper[13046]: I0308 03:23:04.321461 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 03:23:04.368983 master-0 kubenswrapper[13046]: I0308 03:23:04.368934 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:23:04.402858 master-0 kubenswrapper[13046]: I0308 03:23:04.402801 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 03:23:04.403298 master-0 kubenswrapper[13046]: I0308 03:23:04.403251 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 03:23:04.487278 master-0 kubenswrapper[13046]: I0308 03:23:04.487169 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:23:04.491003 master-0 kubenswrapper[13046]: I0308 03:23:04.490944 13046 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 03:23:04.543048 master-0 kubenswrapper[13046]: I0308 03:23:04.542960 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 03:23:04.624728 master-0 kubenswrapper[13046]: I0308 03:23:04.624605 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 03:23:04.630177 master-0 kubenswrapper[13046]: I0308 03:23:04.630123 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:23:04.769262 master-0 kubenswrapper[13046]: I0308 03:23:04.769212 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 03:23:04.839025 master-0 kubenswrapper[13046]: I0308 03:23:04.838966 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:23:04.978584 master-0 kubenswrapper[13046]: I0308 03:23:04.978107 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-chsmd" Mar 08 03:23:05.081142 master-0 kubenswrapper[13046]: I0308 03:23:05.081065 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 03:23:05.084070 master-0 kubenswrapper[13046]: I0308 03:23:05.084036 13046 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 03:23:05.365723 master-0 kubenswrapper[13046]: I0308 03:23:05.365603 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 03:23:05.384462 master-0 kubenswrapper[13046]: I0308 03:23:05.384407 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 08 03:23:05.393108 master-0 kubenswrapper[13046]: I0308 03:23:05.393059 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 03:23:05.425557 master-0 kubenswrapper[13046]: I0308 03:23:05.425451 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 03:23:05.459531 master-0 kubenswrapper[13046]: I0308 03:23:05.459453 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 03:23:05.530722 master-0 kubenswrapper[13046]: I0308 03:23:05.530654 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 03:23:05.578509 master-0 kubenswrapper[13046]: I0308 03:23:05.578398 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:23:05.580194 master-0 kubenswrapper[13046]: I0308 03:23:05.580095 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 08 03:23:05.608085 master-0 kubenswrapper[13046]: I0308 03:23:05.608031 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 03:23:05.666201 master-0 kubenswrapper[13046]: I0308 03:23:05.665969 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 03:23:05.842240 master-0 kubenswrapper[13046]: I0308 03:23:05.842120 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 03:23:06.021570 master-0 kubenswrapper[13046]: I0308 
03:23:06.021268 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 03:23:06.023288 master-0 kubenswrapper[13046]: I0308 03:23:06.023236 13046 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 03:23:06.025674 master-0 kubenswrapper[13046]: I0308 03:23:06.025599 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 03:23:06.032798 master-0 kubenswrapper[13046]: I0308 03:23:06.032744 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:23:06.033092 master-0 kubenswrapper[13046]: I0308 03:23:06.033058 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:23:06.033756 master-0 kubenswrapper[13046]: I0308 03:23:06.033687 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:23:06.033756 master-0 kubenswrapper[13046]: I0308 03:23:06.033741 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6b32677f-b96e-4c3b-a91b-5f993d3681c7" Mar 08 03:23:06.041088 master-0 kubenswrapper[13046]: I0308 03:23:06.041051 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:23:06.075291 master-0 kubenswrapper[13046]: I0308 03:23:06.075216 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 03:23:06.088008 master-0 kubenswrapper[13046]: I0308 03:23:06.087891 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=22.087870166 
podStartE2EDuration="22.087870166s" podCreationTimestamp="2026-03-08 03:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:23:06.06868758 +0000 UTC m=+588.147454837" watchObservedRunningTime="2026-03-08 03:23:06.087870166 +0000 UTC m=+588.166637393" Mar 08 03:23:06.125134 master-0 kubenswrapper[13046]: I0308 03:23:06.123784 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 03:23:06.155198 master-0 kubenswrapper[13046]: I0308 03:23:06.154577 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 03:23:06.322819 master-0 kubenswrapper[13046]: I0308 03:23:06.322637 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:23:06.390138 master-0 kubenswrapper[13046]: I0308 03:23:06.390049 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 03:23:06.403244 master-0 kubenswrapper[13046]: I0308 03:23:06.403194 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 03:23:06.451362 master-0 kubenswrapper[13046]: I0308 03:23:06.451291 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:23:06.552883 master-0 kubenswrapper[13046]: I0308 03:23:06.552844 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 08 03:23:06.610028 master-0 kubenswrapper[13046]: I0308 03:23:06.609970 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 08 03:23:06.621731 master-0 
kubenswrapper[13046]: I0308 03:23:06.621670 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 03:23:06.662468 master-0 kubenswrapper[13046]: I0308 03:23:06.655869 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 03:23:06.671742 master-0 kubenswrapper[13046]: I0308 03:23:06.671660 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:23:06.672115 master-0 kubenswrapper[13046]: I0308 03:23:06.672019 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" containerID="cri-o://62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0" gracePeriod=5 Mar 08 03:23:06.681908 master-0 kubenswrapper[13046]: I0308 03:23:06.681788 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 08 03:23:06.691364 master-0 kubenswrapper[13046]: I0308 03:23:06.691284 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 03:23:06.750114 master-0 kubenswrapper[13046]: I0308 03:23:06.750044 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 03:23:06.765396 master-0 kubenswrapper[13046]: I0308 03:23:06.765316 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 03:23:06.828894 master-0 kubenswrapper[13046]: I0308 03:23:06.828841 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 03:23:07.197218 
master-0 kubenswrapper[13046]: I0308 03:23:07.197140 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 03:23:07.399524 master-0 kubenswrapper[13046]: I0308 03:23:07.399422 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 03:23:07.502844 master-0 kubenswrapper[13046]: I0308 03:23:07.502512 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 03:23:07.565732 master-0 kubenswrapper[13046]: I0308 03:23:07.565653 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 03:23:07.587735 master-0 kubenswrapper[13046]: I0308 03:23:07.587681 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:23:07.695376 master-0 kubenswrapper[13046]: I0308 03:23:07.695313 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 08 03:23:07.701161 master-0 kubenswrapper[13046]: I0308 03:23:07.701115 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 03:23:07.846978 master-0 kubenswrapper[13046]: I0308 03:23:07.846886 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:23:07.849839 master-0 kubenswrapper[13046]: I0308 03:23:07.849768 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 03:23:07.947440 master-0 kubenswrapper[13046]: I0308 03:23:07.947367 13046 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 08 03:23:07.964024 master-0 kubenswrapper[13046]: I0308 03:23:07.963974 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 08 03:23:08.013965 master-0 kubenswrapper[13046]: I0308 03:23:08.013902 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:23:08.128423 master-0 kubenswrapper[13046]: I0308 03:23:08.128255 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 03:23:08.245729 master-0 kubenswrapper[13046]: I0308 03:23:08.245622 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:23:08.268849 master-0 kubenswrapper[13046]: I0308 03:23:08.268765 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 03:23:08.270478 master-0 kubenswrapper[13046]: I0308 03:23:08.270416 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 03:23:08.294124 master-0 kubenswrapper[13046]: I0308 03:23:08.294057 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 03:23:08.384875 master-0 kubenswrapper[13046]: I0308 03:23:08.384733 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 08 03:23:08.548669 master-0 kubenswrapper[13046]: I0308 03:23:08.548578 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 03:23:08.598726 master-0 kubenswrapper[13046]: I0308 03:23:08.598666 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 03:23:08.671072 master-0 kubenswrapper[13046]: I0308 03:23:08.670922 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 03:23:08.762964 master-0 kubenswrapper[13046]: I0308 03:23:08.762892 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-svw57" Mar 08 03:23:09.117286 master-0 kubenswrapper[13046]: I0308 03:23:09.117217 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 03:23:09.118857 master-0 kubenswrapper[13046]: I0308 03:23:09.118806 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:23:09.119211 master-0 kubenswrapper[13046]: E0308 03:23:09.119159 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:23:09.178047 master-0 kubenswrapper[13046]: I0308 03:23:09.177968 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 03:23:09.229977 master-0 kubenswrapper[13046]: I0308 03:23:09.229916 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 08 03:23:09.242682 master-0 kubenswrapper[13046]: I0308 03:23:09.242623 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 03:23:09.378890 master-0 
kubenswrapper[13046]: I0308 03:23:09.378407 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 03:23:09.635610 master-0 kubenswrapper[13046]: I0308 03:23:09.635467 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 03:23:09.709628 master-0 kubenswrapper[13046]: I0308 03:23:09.709586 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 03:23:09.774280 master-0 kubenswrapper[13046]: I0308 03:23:09.774223 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 03:23:09.989794 master-0 kubenswrapper[13046]: I0308 03:23:09.989225 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 03:23:10.028599 master-0 kubenswrapper[13046]: I0308 03:23:10.028553 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 03:23:10.490386 master-0 kubenswrapper[13046]: I0308 03:23:10.490290 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 03:23:11.355113 master-0 kubenswrapper[13046]: I0308 03:23:11.355017 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 08 03:23:12.261639 master-0 kubenswrapper[13046]: I0308 03:23:12.261565 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 08 03:23:12.261783 master-0 kubenswrapper[13046]: I0308 03:23:12.261715 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:23:12.266554 master-0 kubenswrapper[13046]: I0308 03:23:12.266450 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:23:12.266645 master-0 kubenswrapper[13046]: I0308 03:23:12.266603 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log" (OuterVolumeSpecName: "var-log") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:23:12.266703 master-0 kubenswrapper[13046]: I0308 03:23:12.266624 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:23:12.266703 master-0 kubenswrapper[13046]: I0308 03:23:12.266687 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock" (OuterVolumeSpecName: "var-lock") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:23:12.266772 master-0 kubenswrapper[13046]: I0308 03:23:12.266720 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:23:12.266807 master-0 kubenswrapper[13046]: I0308 03:23:12.266789 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:23:12.266984 master-0 kubenswrapper[13046]: I0308 03:23:12.266930 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:23:12.267278 master-0 kubenswrapper[13046]: I0308 03:23:12.267235 13046 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") on node \"master-0\" DevicePath \"\"" Mar 08 03:23:12.267325 master-0 kubenswrapper[13046]: I0308 03:23:12.267279 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:23:12.267325 master-0 kubenswrapper[13046]: I0308 03:23:12.267299 13046 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:23:12.274991 master-0 kubenswrapper[13046]: I0308 03:23:12.274927 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:23:12.275444 master-0 kubenswrapper[13046]: I0308 03:23:12.275389 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 08 03:23:12.275532 master-0 kubenswrapper[13046]: I0308 03:23:12.275479 13046 generic.go:334] "Generic (PLEG): container finished" podID="899242a15b2bdf3b4a04fb323647ca94" containerID="62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0" exitCode=137 Mar 08 03:23:12.275595 master-0 kubenswrapper[13046]: I0308 03:23:12.275573 13046 scope.go:117] "RemoveContainer" containerID="62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0" Mar 08 03:23:12.275729 master-0 kubenswrapper[13046]: I0308 03:23:12.275677 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:23:12.323552 master-0 kubenswrapper[13046]: I0308 03:23:12.323446 13046 scope.go:117] "RemoveContainer" containerID="62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0" Mar 08 03:23:12.324550 master-0 kubenswrapper[13046]: E0308 03:23:12.324458 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0\": container with ID starting with 62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0 not found: ID does not exist" containerID="62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0" Mar 08 03:23:12.324862 master-0 kubenswrapper[13046]: I0308 03:23:12.324766 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0"} err="failed to get container status 
\"62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0\": rpc error: code = NotFound desc = could not find container \"62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0\": container with ID starting with 62191d16c7e1dce625324df2d2dfb97fd4f0ed8f8d264dfc40d3e2b764b4def0 not found: ID does not exist" Mar 08 03:23:12.368022 master-0 kubenswrapper[13046]: I0308 03:23:12.367964 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 08 03:23:12.368799 master-0 kubenswrapper[13046]: I0308 03:23:12.368542 13046 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:23:12.368799 master-0 kubenswrapper[13046]: I0308 03:23:12.368724 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests" (OuterVolumeSpecName: "manifests") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:23:12.469374 master-0 kubenswrapper[13046]: I0308 03:23:12.469159 13046 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") on node \"master-0\" DevicePath \"\"" Mar 08 03:23:14.126023 master-0 kubenswrapper[13046]: I0308 03:23:14.125968 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899242a15b2bdf3b4a04fb323647ca94" path="/var/lib/kubelet/pods/899242a15b2bdf3b4a04fb323647ca94/volumes" Mar 08 03:23:19.752352 master-0 kubenswrapper[13046]: I0308 03:23:19.752262 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-twhrj" Mar 08 03:23:21.118271 master-0 kubenswrapper[13046]: I0308 03:23:21.118203 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:23:21.119268 master-0 kubenswrapper[13046]: E0308 03:23:21.118619 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:23:21.352447 master-0 kubenswrapper[13046]: I0308 03:23:21.352379 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 03:23:35.118718 master-0 kubenswrapper[13046]: I0308 03:23:35.118658 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:23:35.119614 master-0 kubenswrapper[13046]: E0308 03:23:35.119039 13046 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:23:46.116899 master-0 kubenswrapper[13046]: I0308 03:23:46.116781 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tvw7c" Mar 08 03:23:46.119020 master-0 kubenswrapper[13046]: I0308 03:23:46.118773 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:23:46.119303 master-0 kubenswrapper[13046]: E0308 03:23:46.119188 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:23:49.360068 master-0 kubenswrapper[13046]: I0308 03:23:49.359975 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rdhz7" Mar 08 03:23:50.508541 master-0 kubenswrapper[13046]: I0308 03:23:50.508470 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 03:24:01.118629 master-0 kubenswrapper[13046]: I0308 03:24:01.118580 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:24:01.119229 master-0 kubenswrapper[13046]: E0308 03:24:01.118874 13046 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:24:04.635115 master-0 kubenswrapper[13046]: I0308 03:24:04.635035 13046 patch_prober.go:28] interesting pod/machine-config-daemon-j6n9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 03:24:04.635991 master-0 kubenswrapper[13046]: I0308 03:24:04.635126 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 03:24:13.118372 master-0 kubenswrapper[13046]: I0308 03:24:13.118295 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:24:13.119314 master-0 kubenswrapper[13046]: E0308 03:24:13.118587 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:24:26.118304 master-0 kubenswrapper[13046]: I0308 03:24:26.118236 13046 scope.go:117] "RemoveContainer" 
containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:24:26.121177 master-0 kubenswrapper[13046]: E0308 03:24:26.118581 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:24:26.360088 master-0 kubenswrapper[13046]: I0308 03:24:26.360005 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: E0308 03:24:26.360255 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: I0308 03:24:26.360266 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: E0308 03:24:26.360280 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" containerName="installer" Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: I0308 03:24:26.360286 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" containerName="installer" Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: E0308 03:24:26.360305 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" containerName="installer" Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: I0308 03:24:26.360311 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" 
containerName="installer" Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: E0308 03:24:26.360318 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" containerName="installer" Mar 08 03:24:26.360362 master-0 kubenswrapper[13046]: I0308 03:24:26.360323 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" containerName="installer" Mar 08 03:24:26.360840 master-0 kubenswrapper[13046]: I0308 03:24:26.360417 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3035f9a-535f-4d1a-b3a0-02e2511894ff" containerName="installer" Mar 08 03:24:26.360840 master-0 kubenswrapper[13046]: I0308 03:24:26.360426 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eb9745-3670-4b36-86fa-9da2aad5a9d4" containerName="installer" Mar 08 03:24:26.360840 master-0 kubenswrapper[13046]: I0308 03:24:26.360444 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 08 03:24:26.360840 master-0 kubenswrapper[13046]: I0308 03:24:26.360453 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a4e339f-ab04-4900-bf34-e683a2ed0eff" containerName="installer" Mar 08 03:24:26.360840 master-0 kubenswrapper[13046]: I0308 03:24:26.360814 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.363126 master-0 kubenswrapper[13046]: I0308 03:24:26.363072 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 03:24:26.363641 master-0 kubenswrapper[13046]: I0308 03:24:26.363599 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ks2rl" Mar 08 03:24:26.374140 master-0 kubenswrapper[13046]: I0308 03:24:26.374098 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 03:24:26.458033 master-0 kubenswrapper[13046]: I0308 03:24:26.457975 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-var-lock\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.458252 master-0 kubenswrapper[13046]: I0308 03:24:26.458045 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06940ded-ef9e-4661-921e-e635f5bc9ef5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.458252 master-0 kubenswrapper[13046]: I0308 03:24:26.458070 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.558988 master-0 
kubenswrapper[13046]: I0308 03:24:26.558896 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-var-lock\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.559211 master-0 kubenswrapper[13046]: I0308 03:24:26.558997 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06940ded-ef9e-4661-921e-e635f5bc9ef5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.559211 master-0 kubenswrapper[13046]: I0308 03:24:26.559043 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-var-lock\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.559211 master-0 kubenswrapper[13046]: I0308 03:24:26.559063 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.559469 master-0 kubenswrapper[13046]: I0308 03:24:26.559389 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.582321 
master-0 kubenswrapper[13046]: I0308 03:24:26.581937 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06940ded-ef9e-4661-921e-e635f5bc9ef5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:26.675691 master-0 kubenswrapper[13046]: I0308 03:24:26.675562 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:24:27.100340 master-0 kubenswrapper[13046]: I0308 03:24:27.100285 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 08 03:24:27.106641 master-0 kubenswrapper[13046]: W0308 03:24:27.106581 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod06940ded_ef9e_4661_921e_e635f5bc9ef5.slice/crio-49d088cafb48506e88ad580b2432b0ec6a24f9cb5f79619713e7bcb5b7df7934 WatchSource:0}: Error finding container 49d088cafb48506e88ad580b2432b0ec6a24f9cb5f79619713e7bcb5b7df7934: Status 404 returned error can't find the container with id 49d088cafb48506e88ad580b2432b0ec6a24f9cb5f79619713e7bcb5b7df7934 Mar 08 03:24:27.906415 master-0 kubenswrapper[13046]: I0308 03:24:27.906250 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"06940ded-ef9e-4661-921e-e635f5bc9ef5","Type":"ContainerStarted","Data":"e237b8ac2c372c642146575a3cd41192f3ddd5dcc59d65f2bc4ed5373ff9b2ca"} Mar 08 03:24:27.906415 master-0 kubenswrapper[13046]: I0308 03:24:27.906321 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"06940ded-ef9e-4661-921e-e635f5bc9ef5","Type":"ContainerStarted","Data":"49d088cafb48506e88ad580b2432b0ec6a24f9cb5f79619713e7bcb5b7df7934"} 
Mar 08 03:24:27.941157 master-0 kubenswrapper[13046]: I0308 03:24:27.941073 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=1.94104287 podStartE2EDuration="1.94104287s" podCreationTimestamp="2026-03-08 03:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:24:27.930709277 +0000 UTC m=+670.009476544" watchObservedRunningTime="2026-03-08 03:24:27.94104287 +0000 UTC m=+670.019810117" Mar 08 03:24:34.635443 master-0 kubenswrapper[13046]: I0308 03:24:34.635331 13046 patch_prober.go:28] interesting pod/machine-config-daemon-j6n9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 03:24:34.635443 master-0 kubenswrapper[13046]: I0308 03:24:34.635420 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 03:24:35.781308 master-0 kubenswrapper[13046]: I0308 03:24:35.781227 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-3-master-0"] Mar 08 03:24:35.783400 master-0 kubenswrapper[13046]: I0308 03:24:35.783362 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:35.790179 master-0 kubenswrapper[13046]: I0308 03:24:35.789974 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-gsmhw" Mar 08 03:24:35.791880 master-0 kubenswrapper[13046]: I0308 03:24:35.790975 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 08 03:24:35.801668 master-0 kubenswrapper[13046]: I0308 03:24:35.801595 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-3-master-0"] Mar 08 03:24:35.905524 master-0 kubenswrapper[13046]: I0308 03:24:35.902801 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-var-lock\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:35.905524 master-0 kubenswrapper[13046]: I0308 03:24:35.902989 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kube-api-access\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:35.905524 master-0 kubenswrapper[13046]: I0308 03:24:35.903037 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kubelet-dir\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.004206 master-0 
kubenswrapper[13046]: I0308 03:24:36.004129 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-var-lock\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.004355 master-0 kubenswrapper[13046]: I0308 03:24:36.004276 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-var-lock\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.004355 master-0 kubenswrapper[13046]: I0308 03:24:36.004325 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kube-api-access\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.004722 master-0 kubenswrapper[13046]: I0308 03:24:36.004579 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kubelet-dir\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.004722 master-0 kubenswrapper[13046]: I0308 03:24:36.004692 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kubelet-dir\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " 
pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.026267 master-0 kubenswrapper[13046]: I0308 03:24:36.026197 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kube-api-access\") pod \"installer-4-retry-3-master-0\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.131179 master-0 kubenswrapper[13046]: I0308 03:24:36.129280 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:24:36.645049 master-0 kubenswrapper[13046]: I0308 03:24:36.644979 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-3-master-0"] Mar 08 03:24:36.971246 master-0 kubenswrapper[13046]: I0308 03:24:36.971083 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" event={"ID":"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd","Type":"ContainerStarted","Data":"d0448c5acb29b6f824d7da3d526803bd48dc582665b17671b3696850c87a92f1"} Mar 08 03:24:37.985031 master-0 kubenswrapper[13046]: I0308 03:24:37.984966 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" event={"ID":"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd","Type":"ContainerStarted","Data":"1ce6ee96c7df4a201555e9bc43eeadec063afc2eb7b89e80cbcd76d111ff9607"} Mar 08 03:24:38.018782 master-0 kubenswrapper[13046]: I0308 03:24:38.018620 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" podStartSLOduration=3.018469911 podStartE2EDuration="3.018469911s" podCreationTimestamp="2026-03-08 03:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 03:24:38.015392924 +0000 UTC m=+680.094160181" watchObservedRunningTime="2026-03-08 03:24:38.018469911 +0000 UTC m=+680.097237188" Mar 08 03:24:38.125018 master-0 kubenswrapper[13046]: I0308 03:24:38.124934 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:24:38.125411 master-0 kubenswrapper[13046]: E0308 03:24:38.125351 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:24:52.118314 master-0 kubenswrapper[13046]: I0308 03:24:52.118204 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7" Mar 08 03:24:52.119456 master-0 kubenswrapper[13046]: E0308 03:24:52.118604 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 08 03:25:00.543546 master-0 kubenswrapper[13046]: I0308 03:25:00.543421 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 03:25:00.544664 master-0 kubenswrapper[13046]: I0308 03:25:00.543790 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" 
containerName="cluster-policy-controller" containerID="cri-o://a9e780a9eb2bd513e2633ca7eb901eb0af2ab3ee1ed5af3a95bd9d57edb15b71" gracePeriod=30 Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.545942 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.546307 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546329 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.546350 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546363 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.546382 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546396 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.546409 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546421 13046 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.546461 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546473 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.546519 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546532 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546734 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546754 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546767 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546792 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546812 13046 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546842 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.546859 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.547050 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.547068 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.547085 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.547097 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.547111 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.547123 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: E0308 03:25:00.547160 13046 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.547172 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.547345 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.547521 master-0 kubenswrapper[13046]: I0308 03:25:00.547383 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.549972 master-0 kubenswrapper[13046]: I0308 03:25:00.547787 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 08 03:25:00.549972 master-0 kubenswrapper[13046]: I0308 03:25:00.549081 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.645526 master-0 kubenswrapper[13046]: I0308 03:25:00.645423 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.645721 master-0 kubenswrapper[13046]: I0308 03:25:00.645540 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.747276 master-0 kubenswrapper[13046]: I0308 03:25:00.747168 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.747276 master-0 kubenswrapper[13046]: I0308 03:25:00.747249 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.747694 master-0 kubenswrapper[13046]: I0308 03:25:00.747402 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.747694 master-0 kubenswrapper[13046]: I0308 03:25:00.747471 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.779477 master-0 kubenswrapper[13046]: I0308 03:25:00.771053 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:25:00.781232 master-0 kubenswrapper[13046]: I0308 03:25:00.781130 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:00.798349 master-0 kubenswrapper[13046]: I0308 03:25:00.797208 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:25:00.838127 master-0 kubenswrapper[13046]: I0308 03:25:00.837278 13046 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="a46f877b-f3b9-40f2-8090-946651f4cb9e" Mar 08 03:25:00.848098 master-0 kubenswrapper[13046]: I0308 03:25:00.848048 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:25:00.848442 master-0 kubenswrapper[13046]: I0308 03:25:00.848412 13046 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:25:00.848691 master-0 kubenswrapper[13046]: I0308 03:25:00.848661 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:25:00.848872 master-0 kubenswrapper[13046]: I0308 03:25:00.848848 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:25:00.849027 master-0 kubenswrapper[13046]: I0308 03:25:00.849003 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 08 03:25:00.849601 master-0 kubenswrapper[13046]: I0308 03:25:00.849572 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs" (OuterVolumeSpecName: "logs") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:00.849783 master-0 kubenswrapper[13046]: I0308 03:25:00.849760 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets" (OuterVolumeSpecName: "secrets") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:00.849937 master-0 kubenswrapper[13046]: I0308 03:25:00.849910 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:00.850090 master-0 kubenswrapper[13046]: I0308 03:25:00.850066 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:00.850242 master-0 kubenswrapper[13046]: I0308 03:25:00.850220 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config" (OuterVolumeSpecName: "config") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:00.950660 master-0 kubenswrapper[13046]: I0308 03:25:00.950622 13046 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:00.950891 master-0 kubenswrapper[13046]: I0308 03:25:00.950857 13046 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:00.950959 master-0 kubenswrapper[13046]: I0308 03:25:00.950949 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:00.951023 master-0 kubenswrapper[13046]: I0308 03:25:00.951013 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:00.951081 master-0 kubenswrapper[13046]: I0308 03:25:00.951072 13046 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:01.172641 master-0 kubenswrapper[13046]: I0308 03:25:01.172558 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b63b5185e5bc481b891676a634cb5625","Type":"ContainerStarted","Data":"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9"} Mar 08 03:25:01.172783 master-0 kubenswrapper[13046]: I0308 03:25:01.172645 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"b63b5185e5bc481b891676a634cb5625","Type":"ContainerStarted","Data":"6260c054d6ac2ebd404d1cf0b0bdb89c8f9afcae6767266baaa0269137ddd984"} Mar 08 03:25:01.174217 master-0 kubenswrapper[13046]: I0308 03:25:01.174146 13046 generic.go:334] "Generic (PLEG): container finished" podID="06940ded-ef9e-4661-921e-e635f5bc9ef5" containerID="e237b8ac2c372c642146575a3cd41192f3ddd5dcc59d65f2bc4ed5373ff9b2ca" exitCode=0 Mar 08 03:25:01.174344 master-0 kubenswrapper[13046]: I0308 03:25:01.174229 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"06940ded-ef9e-4661-921e-e635f5bc9ef5","Type":"ContainerDied","Data":"e237b8ac2c372c642146575a3cd41192f3ddd5dcc59d65f2bc4ed5373ff9b2ca"} Mar 08 03:25:01.178443 master-0 kubenswrapper[13046]: I0308 03:25:01.178355 13046 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="a9e780a9eb2bd513e2633ca7eb901eb0af2ab3ee1ed5af3a95bd9d57edb15b71" exitCode=0 Mar 08 03:25:01.178443 master-0 kubenswrapper[13046]: I0308 03:25:01.178440 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a51959f59f0e4d24bb3dbdaf5b9fc41467ca4227d5fe04d7315090ee15288e0" Mar 08 03:25:01.178813 master-0 kubenswrapper[13046]: I0308 03:25:01.178468 13046 scope.go:117] "RemoveContainer" containerID="5cf51539374bcfe72a242f1e53596d9c98c86b64c9179b7354efb8ce2765e3ca" Mar 08 03:25:01.178813 master-0 kubenswrapper[13046]: I0308 03:25:01.178693 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 08 03:25:01.277158 master-0 kubenswrapper[13046]: I0308 03:25:01.275937 13046 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="a46f877b-f3b9-40f2-8090-946651f4cb9e" Mar 08 03:25:02.131186 master-0 kubenswrapper[13046]: I0308 03:25:02.131105 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c05e1499b533b83f091333d61f045" path="/var/lib/kubelet/pods/f78c05e1499b533b83f091333d61f045/volumes" Mar 08 03:25:02.132024 master-0 kubenswrapper[13046]: I0308 03:25:02.131981 13046 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Mar 08 03:25:02.162534 master-0 kubenswrapper[13046]: I0308 03:25:02.162410 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 03:25:02.162534 master-0 kubenswrapper[13046]: I0308 03:25:02.162475 13046 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="a46f877b-f3b9-40f2-8090-946651f4cb9e" Mar 08 03:25:02.180142 master-0 kubenswrapper[13046]: I0308 03:25:02.176529 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 08 03:25:02.180142 master-0 kubenswrapper[13046]: I0308 03:25:02.176594 13046 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="a46f877b-f3b9-40f2-8090-946651f4cb9e" Mar 08 03:25:02.199216 master-0 kubenswrapper[13046]: I0308 03:25:02.199110 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"b63b5185e5bc481b891676a634cb5625","Type":"ContainerStarted","Data":"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5"} Mar 08 03:25:02.199216 master-0 kubenswrapper[13046]: I0308 03:25:02.199219 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b63b5185e5bc481b891676a634cb5625","Type":"ContainerStarted","Data":"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a"} Mar 08 03:25:02.199552 master-0 kubenswrapper[13046]: I0308 03:25:02.199238 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b63b5185e5bc481b891676a634cb5625","Type":"ContainerStarted","Data":"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c"} Mar 08 03:25:02.229214 master-0 kubenswrapper[13046]: I0308 03:25:02.229065 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.22903717 podStartE2EDuration="2.22903717s" podCreationTimestamp="2026-03-08 03:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:02.224399128 +0000 UTC m=+704.303166345" watchObservedRunningTime="2026-03-08 03:25:02.22903717 +0000 UTC m=+704.307804407" Mar 08 03:25:02.516834 master-0 kubenswrapper[13046]: I0308 03:25:02.516703 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:25:02.672467 master-0 kubenswrapper[13046]: I0308 03:25:02.672363 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06940ded-ef9e-4661-921e-e635f5bc9ef5-kube-api-access\") pod \"06940ded-ef9e-4661-921e-e635f5bc9ef5\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " Mar 08 03:25:02.672759 master-0 kubenswrapper[13046]: I0308 03:25:02.672512 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-kubelet-dir\") pod \"06940ded-ef9e-4661-921e-e635f5bc9ef5\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " Mar 08 03:25:02.672759 master-0 kubenswrapper[13046]: I0308 03:25:02.672569 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-var-lock\") pod \"06940ded-ef9e-4661-921e-e635f5bc9ef5\" (UID: \"06940ded-ef9e-4661-921e-e635f5bc9ef5\") " Mar 08 03:25:02.672759 master-0 kubenswrapper[13046]: I0308 03:25:02.672643 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "06940ded-ef9e-4661-921e-e635f5bc9ef5" (UID: "06940ded-ef9e-4661-921e-e635f5bc9ef5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:02.672759 master-0 kubenswrapper[13046]: I0308 03:25:02.672713 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-var-lock" (OuterVolumeSpecName: "var-lock") pod "06940ded-ef9e-4661-921e-e635f5bc9ef5" (UID: "06940ded-ef9e-4661-921e-e635f5bc9ef5"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:02.673041 master-0 kubenswrapper[13046]: I0308 03:25:02.672882 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:02.673041 master-0 kubenswrapper[13046]: I0308 03:25:02.672896 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/06940ded-ef9e-4661-921e-e635f5bc9ef5-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:02.676707 master-0 kubenswrapper[13046]: I0308 03:25:02.676633 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06940ded-ef9e-4661-921e-e635f5bc9ef5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "06940ded-ef9e-4661-921e-e635f5bc9ef5" (UID: "06940ded-ef9e-4661-921e-e635f5bc9ef5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:25:02.773906 master-0 kubenswrapper[13046]: I0308 03:25:02.773784 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06940ded-ef9e-4661-921e-e635f5bc9ef5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:03.211402 master-0 kubenswrapper[13046]: I0308 03:25:03.211307 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"06940ded-ef9e-4661-921e-e635f5bc9ef5","Type":"ContainerDied","Data":"49d088cafb48506e88ad580b2432b0ec6a24f9cb5f79619713e7bcb5b7df7934"} Mar 08 03:25:03.211402 master-0 kubenswrapper[13046]: I0308 03:25:03.211387 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d088cafb48506e88ad580b2432b0ec6a24f9cb5f79619713e7bcb5b7df7934" Mar 08 03:25:03.211402 master-0 kubenswrapper[13046]: I0308 03:25:03.211329 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 08 03:25:04.634459 master-0 kubenswrapper[13046]: I0308 03:25:04.634381 13046 patch_prober.go:28] interesting pod/machine-config-daemon-j6n9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 03:25:04.635284 master-0 kubenswrapper[13046]: I0308 03:25:04.634463 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 03:25:04.635284 master-0 kubenswrapper[13046]: I0308 03:25:04.634564 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" Mar 08 03:25:04.635423 master-0 kubenswrapper[13046]: I0308 03:25:04.635353 13046 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d49efaffbe78e0cb01e3cafa2637d00f74bc46a5f970b64c00fdc01f5a452ba"} pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 03:25:04.635530 master-0 kubenswrapper[13046]: I0308 03:25:04.635453 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" containerID="cri-o://9d49efaffbe78e0cb01e3cafa2637d00f74bc46a5f970b64c00fdc01f5a452ba" gracePeriod=600 Mar 08 03:25:05.228531 master-0 kubenswrapper[13046]: I0308 
03:25:05.228400 13046 generic.go:334] "Generic (PLEG): container finished" podID="1092f2a6-865c-4706-bba7-068621e85ebc" containerID="9d49efaffbe78e0cb01e3cafa2637d00f74bc46a5f970b64c00fdc01f5a452ba" exitCode=0 Mar 08 03:25:05.228531 master-0 kubenswrapper[13046]: I0308 03:25:05.228470 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" event={"ID":"1092f2a6-865c-4706-bba7-068621e85ebc","Type":"ContainerDied","Data":"9d49efaffbe78e0cb01e3cafa2637d00f74bc46a5f970b64c00fdc01f5a452ba"} Mar 08 03:25:05.228755 master-0 kubenswrapper[13046]: I0308 03:25:05.228550 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" event={"ID":"1092f2a6-865c-4706-bba7-068621e85ebc","Type":"ContainerStarted","Data":"da112ca0720e00375626c44eb1f2627cc39e5ad90a08136dc15830d87d7b2495"} Mar 08 03:25:05.228755 master-0 kubenswrapper[13046]: I0308 03:25:05.228581 13046 scope.go:117] "RemoveContainer" containerID="519bc3beb14de1a649f5b4efc69449f7665f68f38bd11235ec05e6e67ad8ad4d" Mar 08 03:25:08.280156 master-0 kubenswrapper[13046]: I0308 03:25:08.280046 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 03:25:08.281310 master-0 kubenswrapper[13046]: I0308 03:25:08.280371 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" containerID="cri-o://94498e732862075a2e7db935be27489aa50ccc49f721fd0c7a11e45e6a5920c8" gracePeriod=30 Mar 08 03:25:08.282294 master-0 kubenswrapper[13046]: I0308 03:25:08.282206 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 03:25:08.282920 master-0 kubenswrapper[13046]: E0308 03:25:08.282848 13046 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="06940ded-ef9e-4661-921e-e635f5bc9ef5" containerName="installer" Mar 08 03:25:08.282920 master-0 kubenswrapper[13046]: I0308 03:25:08.282902 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="06940ded-ef9e-4661-921e-e635f5bc9ef5" containerName="installer" Mar 08 03:25:08.283111 master-0 kubenswrapper[13046]: E0308 03:25:08.282943 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283111 master-0 kubenswrapper[13046]: I0308 03:25:08.282963 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283111 master-0 kubenswrapper[13046]: E0308 03:25:08.283016 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283111 master-0 kubenswrapper[13046]: I0308 03:25:08.283036 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283111 master-0 kubenswrapper[13046]: E0308 03:25:08.283062 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283711 master-0 kubenswrapper[13046]: I0308 03:25:08.283078 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283805 master-0 kubenswrapper[13046]: I0308 03:25:08.283742 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283805 master-0 kubenswrapper[13046]: I0308 03:25:08.283780 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283986 master-0 
kubenswrapper[13046]: I0308 03:25:08.283806 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.283986 master-0 kubenswrapper[13046]: I0308 03:25:08.283846 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="06940ded-ef9e-4661-921e-e635f5bc9ef5" containerName="installer" Mar 08 03:25:08.284539 master-0 kubenswrapper[13046]: E0308 03:25:08.284441 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.284692 master-0 kubenswrapper[13046]: I0308 03:25:08.284602 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.285153 master-0 kubenswrapper[13046]: I0308 03:25:08.285089 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 08 03:25:08.287383 master-0 kubenswrapper[13046]: I0308 03:25:08.287274 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.455768 master-0 kubenswrapper[13046]: I0308 03:25:08.455702 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.455768 master-0 kubenswrapper[13046]: I0308 03:25:08.455764 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.469005 master-0 kubenswrapper[13046]: I0308 03:25:08.468930 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:25:08.482412 master-0 kubenswrapper[13046]: I0308 03:25:08.482331 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 03:25:08.506751 master-0 kubenswrapper[13046]: I0308 03:25:08.506609 13046 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="a9ad9917-4f2f-428a-b63e-77cccea17134" Mar 08 03:25:08.557730 master-0 kubenswrapper[13046]: I0308 03:25:08.557532 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 08 03:25:08.557730 master-0 kubenswrapper[13046]: I0308 03:25:08.557643 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets" (OuterVolumeSpecName: "secrets") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:08.557730 master-0 kubenswrapper[13046]: I0308 03:25:08.557712 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 08 03:25:08.558188 master-0 kubenswrapper[13046]: I0308 03:25:08.557793 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs" (OuterVolumeSpecName: "logs") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:08.558188 master-0 kubenswrapper[13046]: I0308 03:25:08.557951 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.558188 master-0 kubenswrapper[13046]: I0308 03:25:08.558133 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.558549 master-0 kubenswrapper[13046]: I0308 03:25:08.558205 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.558549 master-0 kubenswrapper[13046]: I0308 03:25:08.558280 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.558549 master-0 kubenswrapper[13046]: I0308 03:25:08.558299 13046 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:08.558549 master-0 
kubenswrapper[13046]: I0308 03:25:08.558331 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:08.776811 master-0 kubenswrapper[13046]: I0308 03:25:08.776727 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:08.817304 master-0 kubenswrapper[13046]: W0308 03:25:08.816805 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3d45b6ce1b3764f9927e623a71adf8.slice/crio-2555f1b92822f2870135919975dbd486303fecca78d27947d144c55e2df2020c WatchSource:0}: Error finding container 2555f1b92822f2870135919975dbd486303fecca78d27947d144c55e2df2020c: Status 404 returned error can't find the container with id 2555f1b92822f2870135919975dbd486303fecca78d27947d144c55e2df2020c Mar 08 03:25:09.269506 master-0 kubenswrapper[13046]: I0308 03:25:09.269361 13046 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="33c664004c9c37147e8697c5db7face7d0516f1bc99f1c6f6db18ef4b8f7bcd8" exitCode=0 Mar 08 03:25:09.269506 master-0 kubenswrapper[13046]: I0308 03:25:09.269448 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerDied","Data":"33c664004c9c37147e8697c5db7face7d0516f1bc99f1c6f6db18ef4b8f7bcd8"} Mar 08 03:25:09.269929 master-0 kubenswrapper[13046]: I0308 03:25:09.269555 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"2555f1b92822f2870135919975dbd486303fecca78d27947d144c55e2df2020c"} Mar 08 03:25:09.273582 master-0 
kubenswrapper[13046]: I0308 03:25:09.273471 13046 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="94498e732862075a2e7db935be27489aa50ccc49f721fd0c7a11e45e6a5920c8" exitCode=0 Mar 08 03:25:09.273732 master-0 kubenswrapper[13046]: I0308 03:25:09.273589 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 08 03:25:09.273814 master-0 kubenswrapper[13046]: I0308 03:25:09.273736 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95a369c03bded3204a94d21dff5dbde8f62d4ea41e3566761a00a3a310a29778" Mar 08 03:25:09.273814 master-0 kubenswrapper[13046]: I0308 03:25:09.273609 13046 scope.go:117] "RemoveContainer" containerID="948a7b297afa3e564a934c704f69529deb5cbe59409e42dcc466920922ad1702" Mar 08 03:25:09.277720 master-0 kubenswrapper[13046]: I0308 03:25:09.277100 13046 generic.go:334] "Generic (PLEG): container finished" podID="f568986f-7c60-4c7d-bc3c-1a0113b1f0dd" containerID="1ce6ee96c7df4a201555e9bc43eeadec063afc2eb7b89e80cbcd76d111ff9607" exitCode=0 Mar 08 03:25:09.277720 master-0 kubenswrapper[13046]: I0308 03:25:09.277161 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" event={"ID":"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd","Type":"ContainerDied","Data":"1ce6ee96c7df4a201555e9bc43eeadec063afc2eb7b89e80cbcd76d111ff9607"} Mar 08 03:25:10.141436 master-0 kubenswrapper[13046]: I0308 03:25:10.141346 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a56802af72ce1aac6b5077f1695ac0" path="/var/lib/kubelet/pods/a1a56802af72ce1aac6b5077f1695ac0/volumes" Mar 08 03:25:10.142143 master-0 kubenswrapper[13046]: I0308 03:25:10.141890 13046 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 08 03:25:10.169108 master-0 kubenswrapper[13046]: I0308 
03:25:10.169017 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 03:25:10.169297 master-0 kubenswrapper[13046]: I0308 03:25:10.169111 13046 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="a9ad9917-4f2f-428a-b63e-77cccea17134" Mar 08 03:25:10.177024 master-0 kubenswrapper[13046]: I0308 03:25:10.176945 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 08 03:25:10.177135 master-0 kubenswrapper[13046]: I0308 03:25:10.177021 13046 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="a9ad9917-4f2f-428a-b63e-77cccea17134" Mar 08 03:25:10.297552 master-0 kubenswrapper[13046]: I0308 03:25:10.297441 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"b162d8349d83460cb664c5872e401282175cb86df3f0012fb7fce29a941e6bca"} Mar 08 03:25:10.297552 master-0 kubenswrapper[13046]: I0308 03:25:10.297551 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"35834a4aa2c1aafe2e80cffe71b4934a09d612026d02ceb8d478e6578d08c89b"} Mar 08 03:25:10.297870 master-0 kubenswrapper[13046]: I0308 03:25:10.297580 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"85d08f576755b1f4982d207073aca843243efd692daf52094875911a38bb8b2f"} Mar 08 03:25:10.298127 master-0 kubenswrapper[13046]: I0308 03:25:10.298068 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:25:10.339304 master-0 kubenswrapper[13046]: I0308 03:25:10.339170 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.339107511 podStartE2EDuration="2.339107511s" podCreationTimestamp="2026-03-08 03:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:10.336073365 +0000 UTC m=+712.414840622" watchObservedRunningTime="2026-03-08 03:25:10.339107511 +0000 UTC m=+712.417874808" Mar 08 03:25:10.721972 master-0 kubenswrapper[13046]: I0308 03:25:10.721860 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:25:10.784073 master-0 kubenswrapper[13046]: I0308 03:25:10.784005 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:10.784272 master-0 kubenswrapper[13046]: I0308 03:25:10.784085 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:10.784272 master-0 kubenswrapper[13046]: I0308 03:25:10.784112 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:10.784272 master-0 kubenswrapper[13046]: I0308 03:25:10.784131 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:10.790141 master-0 kubenswrapper[13046]: I0308 03:25:10.790093 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:10.790803 master-0 kubenswrapper[13046]: I0308 03:25:10.790728 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:10.897503 master-0 kubenswrapper[13046]: I0308 03:25:10.896682 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kube-api-access\") pod \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " Mar 08 03:25:10.897503 master-0 kubenswrapper[13046]: I0308 03:25:10.896782 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kubelet-dir\") pod \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " Mar 08 03:25:10.897503 master-0 kubenswrapper[13046]: I0308 03:25:10.896843 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-var-lock\") pod \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\" (UID: \"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd\") " Mar 08 03:25:10.897503 master-0 kubenswrapper[13046]: I0308 03:25:10.897256 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-var-lock" (OuterVolumeSpecName: "var-lock") pod "f568986f-7c60-4c7d-bc3c-1a0113b1f0dd" (UID: "f568986f-7c60-4c7d-bc3c-1a0113b1f0dd"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:10.906563 master-0 kubenswrapper[13046]: I0308 03:25:10.902116 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f568986f-7c60-4c7d-bc3c-1a0113b1f0dd" (UID: "f568986f-7c60-4c7d-bc3c-1a0113b1f0dd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:25:10.906563 master-0 kubenswrapper[13046]: I0308 03:25:10.902229 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f568986f-7c60-4c7d-bc3c-1a0113b1f0dd" (UID: "f568986f-7c60-4c7d-bc3c-1a0113b1f0dd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:10.998338 master-0 kubenswrapper[13046]: I0308 03:25:10.998201 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:10.998338 master-0 kubenswrapper[13046]: I0308 03:25:10.998240 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:10.998338 master-0 kubenswrapper[13046]: I0308 03:25:10.998256 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f568986f-7c60-4c7d-bc3c-1a0113b1f0dd-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:11.309457 master-0 kubenswrapper[13046]: I0308 03:25:11.309268 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" 
event={"ID":"f568986f-7c60-4c7d-bc3c-1a0113b1f0dd","Type":"ContainerDied","Data":"d0448c5acb29b6f824d7da3d526803bd48dc582665b17671b3696850c87a92f1"} Mar 08 03:25:11.309457 master-0 kubenswrapper[13046]: I0308 03:25:11.309345 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0448c5acb29b6f824d7da3d526803bd48dc582665b17671b3696850c87a92f1" Mar 08 03:25:11.309457 master-0 kubenswrapper[13046]: I0308 03:25:11.309385 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-3-master-0" Mar 08 03:25:11.318396 master-0 kubenswrapper[13046]: I0308 03:25:11.318345 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:11.319054 master-0 kubenswrapper[13046]: I0308 03:25:11.318419 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:25:18.443621 master-0 kubenswrapper[13046]: I0308 03:25:18.443460 13046 scope.go:117] "RemoveContainer" containerID="a9e780a9eb2bd513e2633ca7eb901eb0af2ab3ee1ed5af3a95bd9d57edb15b71" Mar 08 03:25:19.298192 master-0 kubenswrapper[13046]: I0308 03:25:19.296372 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-85drn"] Mar 08 03:25:19.298192 master-0 kubenswrapper[13046]: E0308 03:25:19.296638 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f568986f-7c60-4c7d-bc3c-1a0113b1f0dd" containerName="installer" Mar 08 03:25:19.298192 master-0 kubenswrapper[13046]: I0308 03:25:19.296652 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f568986f-7c60-4c7d-bc3c-1a0113b1f0dd" containerName="installer" Mar 08 03:25:19.298192 master-0 kubenswrapper[13046]: I0308 03:25:19.296754 13046 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f568986f-7c60-4c7d-bc3c-1a0113b1f0dd" containerName="installer" Mar 08 03:25:19.298192 master-0 kubenswrapper[13046]: I0308 03:25:19.297134 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.303527 master-0 kubenswrapper[13046]: I0308 03:25:19.303191 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-m2ccx" Mar 08 03:25:19.309659 master-0 kubenswrapper[13046]: I0308 03:25:19.303756 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 03:25:19.419311 master-0 kubenswrapper[13046]: I0308 03:25:19.415058 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0119e83-0ee0-47e4-b591-6f2dc36073d2-serviceca\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.419311 master-0 kubenswrapper[13046]: I0308 03:25:19.415120 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0119e83-0ee0-47e4-b591-6f2dc36073d2-host\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.419311 master-0 kubenswrapper[13046]: I0308 03:25:19.415157 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qfj\" (UniqueName: \"kubernetes.io/projected/b0119e83-0ee0-47e4-b591-6f2dc36073d2-kube-api-access-l9qfj\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.424506 master-0 kubenswrapper[13046]: I0308 03:25:19.421492 13046 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r"] Mar 08 03:25:19.424506 master-0 kubenswrapper[13046]: I0308 03:25:19.422364 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.425877 master-0 kubenswrapper[13046]: I0308 03:25:19.424888 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-zr868" Mar 08 03:25:19.425877 master-0 kubenswrapper[13046]: I0308 03:25:19.424986 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 03:25:19.425877 master-0 kubenswrapper[13046]: I0308 03:25:19.425107 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 03:25:19.425877 master-0 kubenswrapper[13046]: I0308 03:25:19.425115 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 03:25:19.425877 master-0 kubenswrapper[13046]: I0308 03:25:19.425127 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 03:25:19.425877 master-0 kubenswrapper[13046]: I0308 03:25:19.425353 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 03:25:19.475042 master-0 kubenswrapper[13046]: I0308 03:25:19.474987 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-772zs"] Mar 08 03:25:19.475545 master-0 kubenswrapper[13046]: I0308 03:25:19.475195 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" 
podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="multus-admission-controller" containerID="cri-o://bf7afb690bf11b8a7c9ce9f568adbdaaa57866a3aff5ced1711ca0a11620089f" gracePeriod=30 Mar 08 03:25:19.475545 master-0 kubenswrapper[13046]: I0308 03:25:19.475532 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="kube-rbac-proxy" containerID="cri-o://29c6ed5b13bfb915384e6141f8cbf16cba543eb6524f87e0bd97e324ceae1c63" gracePeriod=30 Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517270 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qfj\" (UniqueName: \"kubernetes.io/projected/b0119e83-0ee0-47e4-b591-6f2dc36073d2-kube-api-access-l9qfj\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517316 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a42a6a01-d8a7-4430-b919-904c41c875b1-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517336 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a42a6a01-d8a7-4430-b919-904c41c875b1-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517365 
13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0119e83-0ee0-47e4-b591-6f2dc36073d2-serviceca\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517382 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42a6a01-d8a7-4430-b919-904c41c875b1-config\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517408 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb8xk\" (UniqueName: \"kubernetes.io/projected/a42a6a01-d8a7-4430-b919-904c41c875b1-kube-api-access-lb8xk\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517433 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0119e83-0ee0-47e4-b591-6f2dc36073d2-host\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: I0308 03:25:19.517503 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0119e83-0ee0-47e4-b591-6f2dc36073d2-host\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.520505 master-0 kubenswrapper[13046]: 
I0308 03:25:19.518424 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b0119e83-0ee0-47e4-b591-6f2dc36073d2-serviceca\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.548658 master-0 kubenswrapper[13046]: I0308 03:25:19.545135 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"] Mar 08 03:25:19.548658 master-0 kubenswrapper[13046]: I0308 03:25:19.545415 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="kube-rbac-proxy" containerID="cri-o://a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143" gracePeriod=30 Mar 08 03:25:19.548658 master-0 kubenswrapper[13046]: I0308 03:25:19.545459 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="config-sync-controllers" containerID="cri-o://786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6" gracePeriod=30 Mar 08 03:25:19.548658 master-0 kubenswrapper[13046]: I0308 03:25:19.546941 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="cluster-cloud-controller-manager" containerID="cri-o://cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f" gracePeriod=30 Mar 08 03:25:19.567679 master-0 kubenswrapper[13046]: I0308 03:25:19.562210 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qfj\" (UniqueName: \"kubernetes.io/projected/b0119e83-0ee0-47e4-b591-6f2dc36073d2-kube-api-access-l9qfj\") pod \"node-ca-85drn\" (UID: \"b0119e83-0ee0-47e4-b591-6f2dc36073d2\") " pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.620364 master-0 kubenswrapper[13046]: I0308 03:25:19.618091 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42a6a01-d8a7-4430-b919-904c41c875b1-config\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.620364 master-0 kubenswrapper[13046]: I0308 03:25:19.618159 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb8xk\" (UniqueName: \"kubernetes.io/projected/a42a6a01-d8a7-4430-b919-904c41c875b1-kube-api-access-lb8xk\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.620364 master-0 kubenswrapper[13046]: I0308 03:25:19.618220 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a42a6a01-d8a7-4430-b919-904c41c875b1-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.620364 master-0 kubenswrapper[13046]: I0308 03:25:19.618237 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a42a6a01-d8a7-4430-b919-904c41c875b1-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") 
" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.622516 master-0 kubenswrapper[13046]: I0308 03:25:19.621276 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42a6a01-d8a7-4430-b919-904c41c875b1-config\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.622516 master-0 kubenswrapper[13046]: I0308 03:25:19.621738 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a42a6a01-d8a7-4430-b919-904c41c875b1-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.627529 master-0 kubenswrapper[13046]: I0308 03:25:19.624623 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-flsjg"] Mar 08 03:25:19.627529 master-0 kubenswrapper[13046]: I0308 03:25:19.625000 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a42a6a01-d8a7-4430-b919-904c41c875b1-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.627529 master-0 kubenswrapper[13046]: I0308 03:25:19.625355 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.642506 master-0 kubenswrapper[13046]: I0308 03:25:19.639283 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 03:25:19.642506 master-0 kubenswrapper[13046]: I0308 03:25:19.639499 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 03:25:19.650042 master-0 kubenswrapper[13046]: I0308 03:25:19.645553 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 03:25:19.650042 master-0 kubenswrapper[13046]: I0308 03:25:19.645734 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-kfwcc" Mar 08 03:25:19.650042 master-0 kubenswrapper[13046]: I0308 03:25:19.645862 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 03:25:19.650042 master-0 kubenswrapper[13046]: I0308 03:25:19.649785 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 03:25:19.661264 master-0 kubenswrapper[13046]: I0308 03:25:19.660048 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb8xk\" (UniqueName: \"kubernetes.io/projected/a42a6a01-d8a7-4430-b919-904c41c875b1-kube-api-access-lb8xk\") pod \"machine-approver-754bdc9f9d-rmg6r\" (UID: \"a42a6a01-d8a7-4430-b919-904c41c875b1\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.661928 master-0 kubenswrapper[13046]: I0308 03:25:19.661905 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-85drn" Mar 08 03:25:19.700579 master-0 kubenswrapper[13046]: I0308 03:25:19.700314 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-flsjg"] Mar 08 03:25:19.706870 master-0 kubenswrapper[13046]: W0308 03:25:19.706825 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0119e83_0ee0_47e4_b591_6f2dc36073d2.slice/crio-f339263692b55711f5c8f16bb92b4ee6fce17f2e7aa5d683851aaf44fc3f2a30 WatchSource:0}: Error finding container f339263692b55711f5c8f16bb92b4ee6fce17f2e7aa5d683851aaf44fc3f2a30: Status 404 returned error can't find the container with id f339263692b55711f5c8f16bb92b4ee6fce17f2e7aa5d683851aaf44fc3f2a30 Mar 08 03:25:19.708935 master-0 kubenswrapper[13046]: I0308 03:25:19.708706 13046 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 03:25:19.715788 master-0 kubenswrapper[13046]: I0308 03:25:19.715744 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/config-sync-controllers/0.log" Mar 08 03:25:19.716548 master-0 kubenswrapper[13046]: I0308 03:25:19.716517 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/cluster-cloud-controller-manager/0.log" Mar 08 03:25:19.717303 master-0 kubenswrapper[13046]: I0308 03:25:19.716613 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:25:19.719084 master-0 kubenswrapper[13046]: I0308 03:25:19.719056 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec752a2e-4b18-4f4d-af88-19594345ae1c-serving-cert\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.719160 master-0 kubenswrapper[13046]: I0308 03:25:19.719106 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec752a2e-4b18-4f4d-af88-19594345ae1c-trusted-ca\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.719160 master-0 kubenswrapper[13046]: I0308 03:25:19.719152 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6m9l\" (UniqueName: \"kubernetes.io/projected/ec752a2e-4b18-4f4d-af88-19594345ae1c-kube-api-access-w6m9l\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.719233 master-0 kubenswrapper[13046]: I0308 03:25:19.719189 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec752a2e-4b18-4f4d-af88-19594345ae1c-config\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.749184 master-0 kubenswrapper[13046]: I0308 03:25:19.749122 13046 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" Mar 08 03:25:19.773316 master-0 kubenswrapper[13046]: W0308 03:25:19.773266 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42a6a01_d8a7_4430_b919_904c41c875b1.slice/crio-68217f423a84b37ee037a98812063acdebc0a4750d75a3e01cf7e31574938696 WatchSource:0}: Error finding container 68217f423a84b37ee037a98812063acdebc0a4750d75a3e01cf7e31574938696: Status 404 returned error can't find the container with id 68217f423a84b37ee037a98812063acdebc0a4750d75a3e01cf7e31574938696 Mar 08 03:25:19.820081 master-0 kubenswrapper[13046]: I0308 03:25:19.820026 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") pod \"52836130-d42e-495c-adbf-19ff9a393347\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " Mar 08 03:25:19.820081 master-0 kubenswrapper[13046]: I0308 03:25:19.820113 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdxtt\" (UniqueName: \"kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt\") pod \"52836130-d42e-495c-adbf-19ff9a393347\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " Mar 08 03:25:19.820300 master-0 kubenswrapper[13046]: I0308 03:25:19.820154 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") pod \"52836130-d42e-495c-adbf-19ff9a393347\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " Mar 08 03:25:19.820300 master-0 kubenswrapper[13046]: I0308 03:25:19.820263 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") pod \"52836130-d42e-495c-adbf-19ff9a393347\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " Mar 08 03:25:19.820372 master-0 kubenswrapper[13046]: I0308 03:25:19.820302 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") pod \"52836130-d42e-495c-adbf-19ff9a393347\" (UID: \"52836130-d42e-495c-adbf-19ff9a393347\") " Mar 08 03:25:19.820505 master-0 kubenswrapper[13046]: I0308 03:25:19.820456 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec752a2e-4b18-4f4d-af88-19594345ae1c-serving-cert\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.820556 master-0 kubenswrapper[13046]: I0308 03:25:19.820522 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec752a2e-4b18-4f4d-af88-19594345ae1c-trusted-ca\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.820592 master-0 kubenswrapper[13046]: I0308 03:25:19.820579 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6m9l\" (UniqueName: \"kubernetes.io/projected/ec752a2e-4b18-4f4d-af88-19594345ae1c-kube-api-access-w6m9l\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.822350 master-0 kubenswrapper[13046]: I0308 03:25:19.820626 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec752a2e-4b18-4f4d-af88-19594345ae1c-config\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.822350 master-0 kubenswrapper[13046]: I0308 03:25:19.821413 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec752a2e-4b18-4f4d-af88-19594345ae1c-config\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.822350 master-0 kubenswrapper[13046]: I0308 03:25:19.821802 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images" (OuterVolumeSpecName: "images") pod "52836130-d42e-495c-adbf-19ff9a393347" (UID: "52836130-d42e-495c-adbf-19ff9a393347"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:25:19.822604 master-0 kubenswrapper[13046]: I0308 03:25:19.822566 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "52836130-d42e-495c-adbf-19ff9a393347" (UID: "52836130-d42e-495c-adbf-19ff9a393347"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:25:19.822978 master-0 kubenswrapper[13046]: I0308 03:25:19.822948 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "52836130-d42e-495c-adbf-19ff9a393347" (UID: "52836130-d42e-495c-adbf-19ff9a393347"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:25:19.824528 master-0 kubenswrapper[13046]: I0308 03:25:19.823820 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec752a2e-4b18-4f4d-af88-19594345ae1c-trusted-ca\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.828330 master-0 kubenswrapper[13046]: I0308 03:25:19.828300 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt" (OuterVolumeSpecName: "kube-api-access-mdxtt") pod "52836130-d42e-495c-adbf-19ff9a393347" (UID: "52836130-d42e-495c-adbf-19ff9a393347"). InnerVolumeSpecName "kube-api-access-mdxtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:25:19.832153 master-0 kubenswrapper[13046]: I0308 03:25:19.832109 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "52836130-d42e-495c-adbf-19ff9a393347" (UID: "52836130-d42e-495c-adbf-19ff9a393347"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:25:19.832664 master-0 kubenswrapper[13046]: I0308 03:25:19.832626 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec752a2e-4b18-4f4d-af88-19594345ae1c-serving-cert\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:19.835853 master-0 kubenswrapper[13046]: E0308 03:25:19.835831 13046 projected.go:288] Couldn't get configMap openshift-console-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 08 03:25:19.835952 master-0 kubenswrapper[13046]: E0308 03:25:19.835939 13046 projected.go:194] Error preparing data for projected volume kube-api-access-w6m9l for pod openshift-console-operator/console-operator-6c7fb6b958-flsjg: configmap "kube-root-ca.crt" not found Mar 08 03:25:19.836071 master-0 kubenswrapper[13046]: E0308 03:25:19.836061 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec752a2e-4b18-4f4d-af88-19594345ae1c-kube-api-access-w6m9l podName:ec752a2e-4b18-4f4d-af88-19594345ae1c nodeName:}" failed. No retries permitted until 2026-03-08 03:25:20.336036759 +0000 UTC m=+722.414803966 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w6m9l" (UniqueName: "kubernetes.io/projected/ec752a2e-4b18-4f4d-af88-19594345ae1c-kube-api-access-w6m9l") pod "console-operator-6c7fb6b958-flsjg" (UID: "ec752a2e-4b18-4f4d-af88-19594345ae1c") : configmap "kube-root-ca.crt" not found Mar 08 03:25:19.922454 master-0 kubenswrapper[13046]: I0308 03:25:19.921666 13046 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/52836130-d42e-495c-adbf-19ff9a393347-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:19.922454 master-0 kubenswrapper[13046]: I0308 03:25:19.921746 13046 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/52836130-d42e-495c-adbf-19ff9a393347-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:19.922454 master-0 kubenswrapper[13046]: I0308 03:25:19.921761 13046 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-images\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:19.922454 master-0 kubenswrapper[13046]: I0308 03:25:19.921771 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdxtt\" (UniqueName: \"kubernetes.io/projected/52836130-d42e-495c-adbf-19ff9a393347-kube-api-access-mdxtt\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:19.922454 master-0 kubenswrapper[13046]: I0308 03:25:19.921783 13046 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52836130-d42e-495c-adbf-19ff9a393347-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:20.437668 master-0 kubenswrapper[13046]: I0308 03:25:20.437051 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6m9l\" (UniqueName: 
\"kubernetes.io/projected/ec752a2e-4b18-4f4d-af88-19594345ae1c-kube-api-access-w6m9l\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:20.446328 master-0 kubenswrapper[13046]: I0308 03:25:20.446244 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6m9l\" (UniqueName: \"kubernetes.io/projected/ec752a2e-4b18-4f4d-af88-19594345ae1c-kube-api-access-w6m9l\") pod \"console-operator-6c7fb6b958-flsjg\" (UID: \"ec752a2e-4b18-4f4d-af88-19594345ae1c\") " pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:20.462368 master-0 kubenswrapper[13046]: I0308 03:25:20.452806 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" event={"ID":"a42a6a01-d8a7-4430-b919-904c41c875b1","Type":"ContainerStarted","Data":"1fb82ccc56af3f2c56b94731064f6b9b2c43793bfac7522706165be333786150"} Mar 08 03:25:20.462368 master-0 kubenswrapper[13046]: I0308 03:25:20.452854 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" event={"ID":"a42a6a01-d8a7-4430-b919-904c41c875b1","Type":"ContainerStarted","Data":"358cb1e4ad8caae96d1a28fb9a317ea4fbb06c81cdc40ce50c90415582a4d47e"} Mar 08 03:25:20.462368 master-0 kubenswrapper[13046]: I0308 03:25:20.452864 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" event={"ID":"a42a6a01-d8a7-4430-b919-904c41c875b1","Type":"ContainerStarted","Data":"68217f423a84b37ee037a98812063acdebc0a4750d75a3e01cf7e31574938696"} Mar 08 03:25:20.462368 master-0 kubenswrapper[13046]: I0308 03:25:20.460030 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/config-sync-controllers/0.log" Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463072 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-kkm7z_52836130-d42e-495c-adbf-19ff9a393347/cluster-cloud-controller-manager/0.log" Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463128 13046 generic.go:334] "Generic (PLEG): container finished" podID="52836130-d42e-495c-adbf-19ff9a393347" containerID="786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6" exitCode=0 Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463150 13046 generic.go:334] "Generic (PLEG): container finished" podID="52836130-d42e-495c-adbf-19ff9a393347" containerID="cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f" exitCode=0 Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463168 13046 generic.go:334] "Generic (PLEG): container finished" podID="52836130-d42e-495c-adbf-19ff9a393347" containerID="a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143" exitCode=0 Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463248 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerDied","Data":"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6"} Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463277 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" 
event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerDied","Data":"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f"} Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463289 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerDied","Data":"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143"} Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463299 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" event={"ID":"52836130-d42e-495c-adbf-19ff9a393347","Type":"ContainerDied","Data":"bf0f6570fe84a3058c6d6122c2d052c6c8b6d42a5f14c4cbfb5452cbc6866dd1"} Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463316 13046 scope.go:117] "RemoveContainer" containerID="786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6" Mar 08 03:25:20.463880 master-0 kubenswrapper[13046]: I0308 03:25:20.463449 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z" Mar 08 03:25:20.467135 master-0 kubenswrapper[13046]: I0308 03:25:20.467098 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-85drn" event={"ID":"b0119e83-0ee0-47e4-b591-6f2dc36073d2","Type":"ContainerStarted","Data":"f339263692b55711f5c8f16bb92b4ee6fce17f2e7aa5d683851aaf44fc3f2a30"} Mar 08 03:25:20.471230 master-0 kubenswrapper[13046]: I0308 03:25:20.469936 13046 generic.go:334] "Generic (PLEG): container finished" podID="23b66415-df37-4015-9a0c-69115b3a0739" containerID="29c6ed5b13bfb915384e6141f8cbf16cba543eb6524f87e0bd97e324ceae1c63" exitCode=0 Mar 08 03:25:20.471230 master-0 kubenswrapper[13046]: I0308 03:25:20.469975 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" event={"ID":"23b66415-df37-4015-9a0c-69115b3a0739","Type":"ContainerDied","Data":"29c6ed5b13bfb915384e6141f8cbf16cba543eb6524f87e0bd97e324ceae1c63"} Mar 08 03:25:20.482431 master-0 kubenswrapper[13046]: I0308 03:25:20.481422 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-rmg6r" podStartSLOduration=1.481396987 podStartE2EDuration="1.481396987s" podCreationTimestamp="2026-03-08 03:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:20.472476374 +0000 UTC m=+722.551243611" watchObservedRunningTime="2026-03-08 03:25:20.481396987 +0000 UTC m=+722.560164214" Mar 08 03:25:20.488791 master-0 kubenswrapper[13046]: I0308 03:25:20.484655 13046 scope.go:117] "RemoveContainer" containerID="cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f" Mar 08 03:25:20.509083 master-0 kubenswrapper[13046]: I0308 03:25:20.509038 13046 scope.go:117] "RemoveContainer" 
containerID="a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143" Mar 08 03:25:20.512302 master-0 kubenswrapper[13046]: I0308 03:25:20.512252 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"] Mar 08 03:25:20.520502 master-0 kubenswrapper[13046]: I0308 03:25:20.519311 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-kkm7z"] Mar 08 03:25:20.529918 master-0 kubenswrapper[13046]: I0308 03:25:20.529883 13046 scope.go:117] "RemoveContainer" containerID="7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d" Mar 08 03:25:20.545751 master-0 kubenswrapper[13046]: I0308 03:25:20.545675 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7"] Mar 08 03:25:20.545989 master-0 kubenswrapper[13046]: E0308 03:25:20.545953 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="kube-rbac-proxy" Mar 08 03:25:20.545989 master-0 kubenswrapper[13046]: I0308 03:25:20.545970 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="kube-rbac-proxy" Mar 08 03:25:20.545989 master-0 kubenswrapper[13046]: E0308 03:25:20.545988 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="cluster-cloud-controller-manager" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: I0308 03:25:20.546000 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="cluster-cloud-controller-manager" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: E0308 03:25:20.546020 13046 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="config-sync-controllers" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: I0308 03:25:20.546031 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="config-sync-controllers" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: E0308 03:25:20.546050 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="config-sync-controllers" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: I0308 03:25:20.546059 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="config-sync-controllers" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: I0308 03:25:20.546198 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="kube-rbac-proxy" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: I0308 03:25:20.546213 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="config-sync-controllers" Mar 08 03:25:20.546220 master-0 kubenswrapper[13046]: I0308 03:25:20.546234 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="cluster-cloud-controller-manager" Mar 08 03:25:20.547215 master-0 kubenswrapper[13046]: I0308 03:25:20.546256 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="config-sync-controllers" Mar 08 03:25:20.547215 master-0 kubenswrapper[13046]: I0308 03:25:20.546274 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="cluster-cloud-controller-manager" Mar 08 03:25:20.547215 master-0 kubenswrapper[13046]: E0308 03:25:20.546422 13046 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="cluster-cloud-controller-manager" Mar 08 03:25:20.547215 master-0 kubenswrapper[13046]: I0308 03:25:20.546433 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="52836130-d42e-495c-adbf-19ff9a393347" containerName="cluster-cloud-controller-manager" Mar 08 03:25:20.547515 master-0 kubenswrapper[13046]: I0308 03:25:20.547329 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.550973 master-0 kubenswrapper[13046]: I0308 03:25:20.550860 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:25:20.551189 master-0 kubenswrapper[13046]: I0308 03:25:20.551146 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:25:20.551475 master-0 kubenswrapper[13046]: I0308 03:25:20.551452 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:25:20.551717 master-0 kubenswrapper[13046]: I0308 03:25:20.551695 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-rkgwg" Mar 08 03:25:20.551914 master-0 kubenswrapper[13046]: I0308 03:25:20.551894 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:25:20.552136 master-0 kubenswrapper[13046]: I0308 03:25:20.552113 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 08 03:25:20.559833 master-0 
kubenswrapper[13046]: I0308 03:25:20.559603 13046 scope.go:117] "RemoveContainer" containerID="8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b" Mar 08 03:25:20.586837 master-0 kubenswrapper[13046]: I0308 03:25:20.586769 13046 scope.go:117] "RemoveContainer" containerID="786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6" Mar 08 03:25:20.587512 master-0 kubenswrapper[13046]: E0308 03:25:20.587460 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6\": container with ID starting with 786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6 not found: ID does not exist" containerID="786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6" Mar 08 03:25:20.587629 master-0 kubenswrapper[13046]: I0308 03:25:20.587511 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6"} err="failed to get container status \"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6\": rpc error: code = NotFound desc = could not find container \"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6\": container with ID starting with 786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6 not found: ID does not exist" Mar 08 03:25:20.587629 master-0 kubenswrapper[13046]: I0308 03:25:20.587535 13046 scope.go:117] "RemoveContainer" containerID="cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f" Mar 08 03:25:20.587778 master-0 kubenswrapper[13046]: E0308 03:25:20.587761 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f\": container with ID starting with 
cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f not found: ID does not exist" containerID="cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f" Mar 08 03:25:20.587852 master-0 kubenswrapper[13046]: I0308 03:25:20.587778 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f"} err="failed to get container status \"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f\": rpc error: code = NotFound desc = could not find container \"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f\": container with ID starting with cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f not found: ID does not exist" Mar 08 03:25:20.587852 master-0 kubenswrapper[13046]: I0308 03:25:20.587792 13046 scope.go:117] "RemoveContainer" containerID="a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143" Mar 08 03:25:20.587996 master-0 kubenswrapper[13046]: E0308 03:25:20.587951 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143\": container with ID starting with a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143 not found: ID does not exist" containerID="a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143" Mar 08 03:25:20.587996 master-0 kubenswrapper[13046]: I0308 03:25:20.587965 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143"} err="failed to get container status \"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143\": rpc error: code = NotFound desc = could not find container \"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143\": container with ID starting with 
a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143 not found: ID does not exist" Mar 08 03:25:20.587996 master-0 kubenswrapper[13046]: I0308 03:25:20.587976 13046 scope.go:117] "RemoveContainer" containerID="7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d" Mar 08 03:25:20.588181 master-0 kubenswrapper[13046]: E0308 03:25:20.588147 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d\": container with ID starting with 7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d not found: ID does not exist" containerID="7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d" Mar 08 03:25:20.588181 master-0 kubenswrapper[13046]: I0308 03:25:20.588161 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d"} err="failed to get container status \"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d\": rpc error: code = NotFound desc = could not find container \"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d\": container with ID starting with 7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d not found: ID does not exist" Mar 08 03:25:20.588181 master-0 kubenswrapper[13046]: I0308 03:25:20.588173 13046 scope.go:117] "RemoveContainer" containerID="8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b" Mar 08 03:25:20.588359 master-0 kubenswrapper[13046]: E0308 03:25:20.588337 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b\": container with ID starting with 8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b not found: ID does not exist" 
containerID="8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b" Mar 08 03:25:20.588420 master-0 kubenswrapper[13046]: I0308 03:25:20.588360 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b"} err="failed to get container status \"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b\": rpc error: code = NotFound desc = could not find container \"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b\": container with ID starting with 8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b not found: ID does not exist" Mar 08 03:25:20.588420 master-0 kubenswrapper[13046]: I0308 03:25:20.588371 13046 scope.go:117] "RemoveContainer" containerID="786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6" Mar 08 03:25:20.588600 master-0 kubenswrapper[13046]: I0308 03:25:20.588553 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6"} err="failed to get container status \"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6\": rpc error: code = NotFound desc = could not find container \"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6\": container with ID starting with 786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6 not found: ID does not exist" Mar 08 03:25:20.588600 master-0 kubenswrapper[13046]: I0308 03:25:20.588576 13046 scope.go:117] "RemoveContainer" containerID="cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f" Mar 08 03:25:20.589837 master-0 kubenswrapper[13046]: I0308 03:25:20.589807 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f"} err="failed to get container status 
\"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f\": rpc error: code = NotFound desc = could not find container \"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f\": container with ID starting with cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f not found: ID does not exist" Mar 08 03:25:20.589837 master-0 kubenswrapper[13046]: I0308 03:25:20.589826 13046 scope.go:117] "RemoveContainer" containerID="a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143" Mar 08 03:25:20.590517 master-0 kubenswrapper[13046]: I0308 03:25:20.590472 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143"} err="failed to get container status \"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143\": rpc error: code = NotFound desc = could not find container \"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143\": container with ID starting with a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143 not found: ID does not exist" Mar 08 03:25:20.590517 master-0 kubenswrapper[13046]: I0308 03:25:20.590506 13046 scope.go:117] "RemoveContainer" containerID="7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d" Mar 08 03:25:20.590894 master-0 kubenswrapper[13046]: I0308 03:25:20.590847 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d"} err="failed to get container status \"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d\": rpc error: code = NotFound desc = could not find container \"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d\": container with ID starting with 7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d not found: ID does not exist" Mar 08 03:25:20.590894 master-0 kubenswrapper[13046]: I0308 
03:25:20.590891 13046 scope.go:117] "RemoveContainer" containerID="8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b" Mar 08 03:25:20.591334 master-0 kubenswrapper[13046]: I0308 03:25:20.591303 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b"} err="failed to get container status \"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b\": rpc error: code = NotFound desc = could not find container \"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b\": container with ID starting with 8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b not found: ID does not exist" Mar 08 03:25:20.591334 master-0 kubenswrapper[13046]: I0308 03:25:20.591323 13046 scope.go:117] "RemoveContainer" containerID="786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6" Mar 08 03:25:20.591557 master-0 kubenswrapper[13046]: I0308 03:25:20.591507 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6"} err="failed to get container status \"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6\": rpc error: code = NotFound desc = could not find container \"786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6\": container with ID starting with 786a6b0c7b7b39dcf229e68947bb7df65e9e2f37e415d695d06c395d3913bfa6 not found: ID does not exist" Mar 08 03:25:20.591557 master-0 kubenswrapper[13046]: I0308 03:25:20.591520 13046 scope.go:117] "RemoveContainer" containerID="cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f" Mar 08 03:25:20.591695 master-0 kubenswrapper[13046]: I0308 03:25:20.591652 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f"} err="failed to get 
container status \"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f\": rpc error: code = NotFound desc = could not find container \"cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f\": container with ID starting with cbb4c081b3f5af1fe4ad8d2babe8b5b625170d93429a8144c4fb177a2ca1532f not found: ID does not exist" Mar 08 03:25:20.591695 master-0 kubenswrapper[13046]: I0308 03:25:20.591665 13046 scope.go:117] "RemoveContainer" containerID="a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143" Mar 08 03:25:20.591818 master-0 kubenswrapper[13046]: I0308 03:25:20.591798 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143"} err="failed to get container status \"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143\": rpc error: code = NotFound desc = could not find container \"a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143\": container with ID starting with a10f3f3c2590ecebc4de2e1bca99e46f3f1c8c40ea052bed9de2ff1bc16be143 not found: ID does not exist" Mar 08 03:25:20.591818 master-0 kubenswrapper[13046]: I0308 03:25:20.591811 13046 scope.go:117] "RemoveContainer" containerID="7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d" Mar 08 03:25:20.591970 master-0 kubenswrapper[13046]: I0308 03:25:20.591939 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d"} err="failed to get container status \"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d\": rpc error: code = NotFound desc = could not find container \"7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d\": container with ID starting with 7cc69cb85d2741414f1de7605fb7527416a02f877e0f246b9d2fd4e409eeab0d not found: ID does not exist" Mar 08 03:25:20.591970 master-0 kubenswrapper[13046]: 
I0308 03:25:20.591954 13046 scope.go:117] "RemoveContainer" containerID="8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b" Mar 08 03:25:20.592102 master-0 kubenswrapper[13046]: I0308 03:25:20.592081 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b"} err="failed to get container status \"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b\": rpc error: code = NotFound desc = could not find container \"8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b\": container with ID starting with 8b97a2efd266e8ca3867573ba0486b32451b9099a517e14c3a85e5b3f61d332b not found: ID does not exist" Mar 08 03:25:20.597364 master-0 kubenswrapper[13046]: I0308 03:25:20.597321 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:20.641050 master-0 kubenswrapper[13046]: I0308 03:25:20.641003 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.641166 master-0 kubenswrapper[13046]: I0308 03:25:20.641061 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.641166 master-0 kubenswrapper[13046]: I0308 03:25:20.641096 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.641313 master-0 kubenswrapper[13046]: I0308 03:25:20.641252 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.641367 master-0 kubenswrapper[13046]: I0308 03:25:20.641323 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72kmp\" (UniqueName: \"kubernetes.io/projected/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-kube-api-access-72kmp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.744281 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.744341 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72kmp\" (UniqueName: \"kubernetes.io/projected/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-kube-api-access-72kmp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.744416 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.745023 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.745263 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.745704 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.746787 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.747142 master-0 kubenswrapper[13046]: I0308 03:25:20.746948 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.762903 master-0 kubenswrapper[13046]: I0308 03:25:20.761978 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.768549 master-0 kubenswrapper[13046]: I0308 03:25:20.768465 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72kmp\" (UniqueName: \"kubernetes.io/projected/d0931dbb-67f7-46d4-bc29-4dacdd9d1108-kube-api-access-72kmp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7\" (UID: \"d0931dbb-67f7-46d4-bc29-4dacdd9d1108\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.884470 master-0 kubenswrapper[13046]: I0308 03:25:20.884399 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" Mar 08 03:25:20.907785 master-0 kubenswrapper[13046]: W0308 03:25:20.907747 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0931dbb_67f7_46d4_bc29_4dacdd9d1108.slice/crio-8efe1f6095d675c4af5a2be4baeae45785a915ee9fcbc673c41ca745caada767 WatchSource:0}: Error finding container 8efe1f6095d675c4af5a2be4baeae45785a915ee9fcbc673c41ca745caada767: Status 404 returned error can't find the container with id 8efe1f6095d675c4af5a2be4baeae45785a915ee9fcbc673c41ca745caada767 Mar 08 03:25:20.995634 master-0 kubenswrapper[13046]: I0308 03:25:20.995453 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-flsjg"] Mar 08 03:25:21.000203 master-0 kubenswrapper[13046]: W0308 03:25:21.000160 13046 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec752a2e_4b18_4f4d_af88_19594345ae1c.slice/crio-c99a11fc61a2c40a2b5360fc94eca41c81b7213bcd293a2f052507c09ac0649a WatchSource:0}: Error finding container c99a11fc61a2c40a2b5360fc94eca41c81b7213bcd293a2f052507c09ac0649a: Status 404 returned error can't find the container with id c99a11fc61a2c40a2b5360fc94eca41c81b7213bcd293a2f052507c09ac0649a Mar 08 03:25:21.480003 master-0 kubenswrapper[13046]: I0308 03:25:21.479689 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" event={"ID":"d0931dbb-67f7-46d4-bc29-4dacdd9d1108","Type":"ContainerStarted","Data":"f0dfa6c0c7dd22245a33165a736775df34c3f8dffb457c190aa0e89646066aba"} Mar 08 03:25:21.480003 master-0 kubenswrapper[13046]: I0308 03:25:21.479732 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" event={"ID":"d0931dbb-67f7-46d4-bc29-4dacdd9d1108","Type":"ContainerStarted","Data":"13bc40c525abf7de076f048467df8f55e7035b6bce374f5f2db6bc99d66c2400"} Mar 08 03:25:21.480003 master-0 kubenswrapper[13046]: I0308 03:25:21.479741 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" event={"ID":"d0931dbb-67f7-46d4-bc29-4dacdd9d1108","Type":"ContainerStarted","Data":"8efe1f6095d675c4af5a2be4baeae45785a915ee9fcbc673c41ca745caada767"} Mar 08 03:25:21.481406 master-0 kubenswrapper[13046]: I0308 03:25:21.481162 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" event={"ID":"ec752a2e-4b18-4f4d-af88-19594345ae1c","Type":"ContainerStarted","Data":"c99a11fc61a2c40a2b5360fc94eca41c81b7213bcd293a2f052507c09ac0649a"} Mar 08 03:25:22.126477 master-0 kubenswrapper[13046]: 
I0308 03:25:22.126419 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52836130-d42e-495c-adbf-19ff9a393347" path="/var/lib/kubelet/pods/52836130-d42e-495c-adbf-19ff9a393347/volumes" Mar 08 03:25:22.492162 master-0 kubenswrapper[13046]: I0308 03:25:22.492106 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-85drn" event={"ID":"b0119e83-0ee0-47e4-b591-6f2dc36073d2","Type":"ContainerStarted","Data":"d682295d71139e3ca86ddd8bd51ee09feee38f73a27a70b8b2f29dd8f4263d2f"} Mar 08 03:25:22.496444 master-0 kubenswrapper[13046]: I0308 03:25:22.496392 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" event={"ID":"d0931dbb-67f7-46d4-bc29-4dacdd9d1108","Type":"ContainerStarted","Data":"e7f1897c8c1cf8d0ec4eae42e97bcc69cd863b9babd164a19660b7fc89f760bd"} Mar 08 03:25:22.510160 master-0 kubenswrapper[13046]: I0308 03:25:22.510083 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-85drn" podStartSLOduration=0.99802516 podStartE2EDuration="3.510063246s" podCreationTimestamp="2026-03-08 03:25:19 +0000 UTC" firstStartedPulling="2026-03-08 03:25:19.708630229 +0000 UTC m=+721.787397446" lastFinishedPulling="2026-03-08 03:25:22.220668315 +0000 UTC m=+724.299435532" observedRunningTime="2026-03-08 03:25:22.505062954 +0000 UTC m=+724.583830201" watchObservedRunningTime="2026-03-08 03:25:22.510063246 +0000 UTC m=+724.588830463" Mar 08 03:25:22.532384 master-0 kubenswrapper[13046]: I0308 03:25:22.532033 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7" podStartSLOduration=2.532013338 podStartE2EDuration="2.532013338s" podCreationTimestamp="2026-03-08 03:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:22.529954949 +0000 UTC m=+724.608722166" watchObservedRunningTime="2026-03-08 03:25:22.532013338 +0000 UTC m=+724.610780555" Mar 08 03:25:22.693517 master-0 kubenswrapper[13046]: I0308 03:25:22.693391 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt"] Mar 08 03:25:22.694297 master-0 kubenswrapper[13046]: I0308 03:25:22.694270 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.696863 master-0 kubenswrapper[13046]: I0308 03:25:22.696829 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-lr548" Mar 08 03:25:22.699437 master-0 kubenswrapper[13046]: I0308 03:25:22.699353 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 03:25:22.762998 master-0 kubenswrapper[13046]: I0308 03:25:22.762939 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt"] Mar 08 03:25:22.776783 master-0 kubenswrapper[13046]: I0308 03:25:22.776564 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.776783 master-0 kubenswrapper[13046]: I0308 03:25:22.776610 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.776783 master-0 kubenswrapper[13046]: I0308 03:25:22.776666 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhrl\" (UniqueName: \"kubernetes.io/projected/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-kube-api-access-xdhrl\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.878073 master-0 kubenswrapper[13046]: I0308 03:25:22.878004 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.878073 master-0 kubenswrapper[13046]: I0308 03:25:22.878068 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.878294 master-0 kubenswrapper[13046]: I0308 03:25:22.878145 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhrl\" (UniqueName: \"kubernetes.io/projected/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-kube-api-access-xdhrl\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: 
\"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.879127 master-0 kubenswrapper[13046]: I0308 03:25:22.879104 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.881556 master-0 kubenswrapper[13046]: I0308 03:25:22.881521 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:22.900290 master-0 kubenswrapper[13046]: I0308 03:25:22.900266 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhrl\" (UniqueName: \"kubernetes.io/projected/b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5-kube-api-access-xdhrl\") pod \"machine-config-controller-ff46b7bdf-t8jqt\" (UID: \"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:23.011722 master-0 kubenswrapper[13046]: I0308 03:25:23.011337 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" Mar 08 03:25:24.312305 master-0 kubenswrapper[13046]: I0308 03:25:24.310981 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt"] Mar 08 03:25:24.511234 master-0 kubenswrapper[13046]: I0308 03:25:24.511193 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" event={"ID":"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5","Type":"ContainerStarted","Data":"7120f5d67194aedaf81ff302fdf76e2fd8df4ff37fa1848838b2e169da1fb56d"} Mar 08 03:25:24.511234 master-0 kubenswrapper[13046]: I0308 03:25:24.511237 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" event={"ID":"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5","Type":"ContainerStarted","Data":"c830300a02fed6225b8e100c7c49888a7570e878fea31309be6b7698a0ec33f1"} Mar 08 03:25:24.512733 master-0 kubenswrapper[13046]: I0308 03:25:24.512700 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" event={"ID":"ec752a2e-4b18-4f4d-af88-19594345ae1c","Type":"ContainerStarted","Data":"1c318b21b952155176941dd22112851dd582e4c451db7c8cb3d109f0acbd6958"} Mar 08 03:25:24.512999 master-0 kubenswrapper[13046]: I0308 03:25:24.512974 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:24.551521 master-0 kubenswrapper[13046]: I0308 03:25:24.547163 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" podStartSLOduration=2.6170970909999998 podStartE2EDuration="5.547148212s" podCreationTimestamp="2026-03-08 03:25:19 +0000 UTC" firstStartedPulling="2026-03-08 
03:25:21.002917866 +0000 UTC m=+723.081685093" lastFinishedPulling="2026-03-08 03:25:23.932968997 +0000 UTC m=+726.011736214" observedRunningTime="2026-03-08 03:25:24.542358946 +0000 UTC m=+726.621126163" watchObservedRunningTime="2026-03-08 03:25:24.547148212 +0000 UTC m=+726.625915429" Mar 08 03:25:24.648410 master-0 kubenswrapper[13046]: I0308 03:25:24.648350 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-5s2m2"] Mar 08 03:25:24.649337 master-0 kubenswrapper[13046]: I0308 03:25:24.649311 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:24.651778 master-0 kubenswrapper[13046]: I0308 03:25:24.651736 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 03:25:24.651874 master-0 kubenswrapper[13046]: I0308 03:25:24.651792 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 03:25:24.652616 master-0 kubenswrapper[13046]: I0308 03:25:24.652568 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w"] Mar 08 03:25:24.652876 master-0 kubenswrapper[13046]: I0308 03:25:24.652676 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 03:25:24.653403 master-0 kubenswrapper[13046]: I0308 03:25:24.653378 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w" Mar 08 03:25:24.654731 master-0 kubenswrapper[13046]: I0308 03:25:24.654692 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 03:25:24.655169 master-0 kubenswrapper[13046]: I0308 03:25:24.655000 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 03:25:24.658936 master-0 kubenswrapper[13046]: I0308 03:25:24.658906 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"] Mar 08 03:25:24.659694 master-0 kubenswrapper[13046]: I0308 03:25:24.659675 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24" Mar 08 03:25:24.660742 master-0 kubenswrapper[13046]: I0308 03:25:24.660716 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 03:25:24.660919 master-0 kubenswrapper[13046]: I0308 03:25:24.660882 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-h99p2" Mar 08 03:25:24.663970 master-0 kubenswrapper[13046]: I0308 03:25:24.663937 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vfssr"] Mar 08 03:25:24.665330 master-0 kubenswrapper[13046]: I0308 03:25:24.665311 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vfssr" Mar 08 03:25:24.667710 master-0 kubenswrapper[13046]: I0308 03:25:24.667679 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 08 03:25:24.669251 master-0 kubenswrapper[13046]: I0308 03:25:24.669211 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 03:25:24.669510 master-0 kubenswrapper[13046]: I0308 03:25:24.669473 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 03:25:24.671553 master-0 kubenswrapper[13046]: I0308 03:25:24.671516 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 03:25:24.671613 master-0 kubenswrapper[13046]: I0308 03:25:24.671554 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-c72nd" Mar 08 03:25:24.672464 master-0 kubenswrapper[13046]: I0308 03:25:24.672418 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w"] Mar 08 03:25:24.684757 master-0 kubenswrapper[13046]: I0308 03:25:24.684610 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"] Mar 08 03:25:24.694966 master-0 kubenswrapper[13046]: I0308 03:25:24.694939 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vfssr"] Mar 08 03:25:24.803299 master-0 kubenswrapper[13046]: I0308 03:25:24.803203 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdb4b\" (UniqueName: \"kubernetes.io/projected/afd61ed2-3f0b-4f56-a99a-d93145461181-kube-api-access-xdb4b\") pod 
\"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:24.803593 master-0 kubenswrapper[13046]: I0308 03:25:24.803568 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afd61ed2-3f0b-4f56-a99a-d93145461181-service-ca-bundle\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:24.803724 master-0 kubenswrapper[13046]: I0308 03:25:24.803703 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-default-certificate\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:24.803851 master-0 kubenswrapper[13046]: I0308 03:25:24.803830 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-stats-auth\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:24.804090 master-0 kubenswrapper[13046]: I0308 03:25:24.804070 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shv4q\" (UniqueName: \"kubernetes.io/projected/7f273cc5-a290-421e-9ad9-b6f0db792fe2-kube-api-access-shv4q\") pod \"ingress-canary-vfssr\" (UID: \"7f273cc5-a290-421e-9ad9-b6f0db792fe2\") " pod="openshift-ingress-canary/ingress-canary-vfssr" Mar 08 03:25:24.804219 master-0 kubenswrapper[13046]: I0308 03:25:24.804199 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f228w\" (UniqueName: \"kubernetes.io/projected/a3a7f795-62ae-4c49-a4a0-e68931bfd4d5-kube-api-access-f228w\") pod \"network-check-source-7c67b67d47-pts2w\" (UID: \"a3a7f795-62ae-4c49-a4a0-e68931bfd4d5\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w" Mar 08 03:25:24.805404 master-0 kubenswrapper[13046]: I0308 03:25:24.804548 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f273cc5-a290-421e-9ad9-b6f0db792fe2-cert\") pod \"ingress-canary-vfssr\" (UID: \"7f273cc5-a290-421e-9ad9-b6f0db792fe2\") " pod="openshift-ingress-canary/ingress-canary-vfssr" Mar 08 03:25:24.805404 master-0 kubenswrapper[13046]: I0308 03:25:24.805332 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-metrics-certs\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:24.805404 master-0 kubenswrapper[13046]: I0308 03:25:24.805392 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e96e2bdd-2b4f-45c9-8db0-4b910d86d62d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-2zr24\" (UID: \"e96e2bdd-2b4f-45c9-8db0-4b910d86d62d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24" Mar 08 03:25:24.830643 master-0 kubenswrapper[13046]: I0308 03:25:24.830601 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-flsjg" Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.906762 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-stats-auth\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.906862 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shv4q\" (UniqueName: \"kubernetes.io/projected/7f273cc5-a290-421e-9ad9-b6f0db792fe2-kube-api-access-shv4q\") pod \"ingress-canary-vfssr\" (UID: \"7f273cc5-a290-421e-9ad9-b6f0db792fe2\") " pod="openshift-ingress-canary/ingress-canary-vfssr"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.906913 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f228w\" (UniqueName: \"kubernetes.io/projected/a3a7f795-62ae-4c49-a4a0-e68931bfd4d5-kube-api-access-f228w\") pod \"network-check-source-7c67b67d47-pts2w\" (UID: \"a3a7f795-62ae-4c49-a4a0-e68931bfd4d5\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.906953 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f273cc5-a290-421e-9ad9-b6f0db792fe2-cert\") pod \"ingress-canary-vfssr\" (UID: \"7f273cc5-a290-421e-9ad9-b6f0db792fe2\") " pod="openshift-ingress-canary/ingress-canary-vfssr"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.906980 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-metrics-certs\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.907006 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e96e2bdd-2b4f-45c9-8db0-4b910d86d62d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-2zr24\" (UID: \"e96e2bdd-2b4f-45c9-8db0-4b910d86d62d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.907043 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdb4b\" (UniqueName: \"kubernetes.io/projected/afd61ed2-3f0b-4f56-a99a-d93145461181-kube-api-access-xdb4b\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.907072 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afd61ed2-3f0b-4f56-a99a-d93145461181-service-ca-bundle\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.907196 master-0 kubenswrapper[13046]: I0308 03:25:24.907107 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-default-certificate\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.909889 master-0 kubenswrapper[13046]: I0308 03:25:24.909839 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afd61ed2-3f0b-4f56-a99a-d93145461181-service-ca-bundle\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.915756 master-0 kubenswrapper[13046]: I0308 03:25:24.915608 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e96e2bdd-2b4f-45c9-8db0-4b910d86d62d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-2zr24\" (UID: \"e96e2bdd-2b4f-45c9-8db0-4b910d86d62d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"
Mar 08 03:25:24.915938 master-0 kubenswrapper[13046]: I0308 03:25:24.915913 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-metrics-certs\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.915999 master-0 kubenswrapper[13046]: I0308 03:25:24.915934 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-default-certificate\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.916060 master-0 kubenswrapper[13046]: I0308 03:25:24.916045 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7f273cc5-a290-421e-9ad9-b6f0db792fe2-cert\") pod \"ingress-canary-vfssr\" (UID: \"7f273cc5-a290-421e-9ad9-b6f0db792fe2\") " pod="openshift-ingress-canary/ingress-canary-vfssr"
Mar 08 03:25:24.916669 master-0 kubenswrapper[13046]: I0308 03:25:24.916590 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/afd61ed2-3f0b-4f56-a99a-d93145461181-stats-auth\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.927174 master-0 kubenswrapper[13046]: I0308 03:25:24.926523 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f228w\" (UniqueName: \"kubernetes.io/projected/a3a7f795-62ae-4c49-a4a0-e68931bfd4d5-kube-api-access-f228w\") pod \"network-check-source-7c67b67d47-pts2w\" (UID: \"a3a7f795-62ae-4c49-a4a0-e68931bfd4d5\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w"
Mar 08 03:25:24.927174 master-0 kubenswrapper[13046]: I0308 03:25:24.926566 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdb4b\" (UniqueName: \"kubernetes.io/projected/afd61ed2-3f0b-4f56-a99a-d93145461181-kube-api-access-xdb4b\") pod \"router-default-79f8cd6fdd-5s2m2\" (UID: \"afd61ed2-3f0b-4f56-a99a-d93145461181\") " pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.929178 master-0 kubenswrapper[13046]: I0308 03:25:24.929140 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shv4q\" (UniqueName: \"kubernetes.io/projected/7f273cc5-a290-421e-9ad9-b6f0db792fe2-kube-api-access-shv4q\") pod \"ingress-canary-vfssr\" (UID: \"7f273cc5-a290-421e-9ad9-b6f0db792fe2\") " pod="openshift-ingress-canary/ingress-canary-vfssr"
Mar 08 03:25:24.971204 master-0 kubenswrapper[13046]: I0308 03:25:24.971134 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:24.994949 master-0 kubenswrapper[13046]: I0308 03:25:24.994664 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w"
Mar 08 03:25:25.024622 master-0 kubenswrapper[13046]: I0308 03:25:25.024445 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"
Mar 08 03:25:25.028663 master-0 kubenswrapper[13046]: I0308 03:25:25.027573 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-84f57b9877-h4zwb"]
Mar 08 03:25:25.028663 master-0 kubenswrapper[13046]: I0308 03:25:25.028325 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-h4zwb"
Mar 08 03:25:25.031360 master-0 kubenswrapper[13046]: I0308 03:25:25.031323 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 03:25:25.031532 master-0 kubenswrapper[13046]: I0308 03:25:25.031510 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 08 03:25:25.032674 master-0 kubenswrapper[13046]: I0308 03:25:25.031638 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-7c8qc"
Mar 08 03:25:25.037869 master-0 kubenswrapper[13046]: I0308 03:25:25.037789 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-h4zwb"]
Mar 08 03:25:25.042337 master-0 kubenswrapper[13046]: I0308 03:25:25.041084 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vfssr"
Mar 08 03:25:25.109408 master-0 kubenswrapper[13046]: I0308 03:25:25.109325 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgfrz\" (UniqueName: \"kubernetes.io/projected/9eb92440-4e70-4fa6-9315-444d6f99e287-kube-api-access-lgfrz\") pod \"downloads-84f57b9877-h4zwb\" (UID: \"9eb92440-4e70-4fa6-9315-444d6f99e287\") " pod="openshift-console/downloads-84f57b9877-h4zwb"
Mar 08 03:25:25.210408 master-0 kubenswrapper[13046]: I0308 03:25:25.210344 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgfrz\" (UniqueName: \"kubernetes.io/projected/9eb92440-4e70-4fa6-9315-444d6f99e287-kube-api-access-lgfrz\") pod \"downloads-84f57b9877-h4zwb\" (UID: \"9eb92440-4e70-4fa6-9315-444d6f99e287\") " pod="openshift-console/downloads-84f57b9877-h4zwb"
Mar 08 03:25:25.225615 master-0 kubenswrapper[13046]: I0308 03:25:25.225572 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgfrz\" (UniqueName: \"kubernetes.io/projected/9eb92440-4e70-4fa6-9315-444d6f99e287-kube-api-access-lgfrz\") pod \"downloads-84f57b9877-h4zwb\" (UID: \"9eb92440-4e70-4fa6-9315-444d6f99e287\") " pod="openshift-console/downloads-84f57b9877-h4zwb"
Mar 08 03:25:25.363598 master-0 kubenswrapper[13046]: I0308 03:25:25.362195 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-h4zwb"
Mar 08 03:25:25.430808 master-0 kubenswrapper[13046]: I0308 03:25:25.430082 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w"]
Mar 08 03:25:25.435418 master-0 kubenswrapper[13046]: W0308 03:25:25.435354 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a7f795_62ae_4c49_a4a0_e68931bfd4d5.slice/crio-f5d9a78f017ba58ae158a00543ef2074dfb931998bda85a8856693b32c5f97ee WatchSource:0}: Error finding container f5d9a78f017ba58ae158a00543ef2074dfb931998bda85a8856693b32c5f97ee: Status 404 returned error can't find the container with id f5d9a78f017ba58ae158a00543ef2074dfb931998bda85a8856693b32c5f97ee
Mar 08 03:25:25.505370 master-0 kubenswrapper[13046]: I0308 03:25:25.503141 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vfssr"]
Mar 08 03:25:25.506722 master-0 kubenswrapper[13046]: W0308 03:25:25.506622 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f273cc5_a290_421e_9ad9_b6f0db792fe2.slice/crio-9b1a870005ebebe91026f3b3c556f208184a6990fb122d3dd35a600c94f21d83 WatchSource:0}: Error finding container 9b1a870005ebebe91026f3b3c556f208184a6990fb122d3dd35a600c94f21d83: Status 404 returned error can't find the container with id 9b1a870005ebebe91026f3b3c556f208184a6990fb122d3dd35a600c94f21d83
Mar 08 03:25:25.508544 master-0 kubenswrapper[13046]: I0308 03:25:25.508019 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"]
Mar 08 03:25:25.542714 master-0 kubenswrapper[13046]: I0308 03:25:25.542106 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" event={"ID":"b8b6b03e-5c45-489d-beb4-a9eedb7fe8a5","Type":"ContainerStarted","Data":"df435af69a9c3fa6159dfc32b252e84cf354ed19cd8919b00cf9e303e70a001d"}
Mar 08 03:25:25.555610 master-0 kubenswrapper[13046]: I0308 03:25:25.543898 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vfssr" event={"ID":"7f273cc5-a290-421e-9ad9-b6f0db792fe2","Type":"ContainerStarted","Data":"9b1a870005ebebe91026f3b3c556f208184a6990fb122d3dd35a600c94f21d83"}
Mar 08 03:25:25.555610 master-0 kubenswrapper[13046]: I0308 03:25:25.544922 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w" event={"ID":"a3a7f795-62ae-4c49-a4a0-e68931bfd4d5","Type":"ContainerStarted","Data":"f5d9a78f017ba58ae158a00543ef2074dfb931998bda85a8856693b32c5f97ee"}
Mar 08 03:25:25.555610 master-0 kubenswrapper[13046]: I0308 03:25:25.545896 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" event={"ID":"afd61ed2-3f0b-4f56-a99a-d93145461181","Type":"ContainerStarted","Data":"aacb28bd895de285b41a734292d1b28023e40101e69022de6d48266fc04b5532"}
Mar 08 03:25:25.555610 master-0 kubenswrapper[13046]: I0308 03:25:25.550469 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24" event={"ID":"e96e2bdd-2b4f-45c9-8db0-4b910d86d62d","Type":"ContainerStarted","Data":"f607535a33d3b990f5d498a85e71265bd1b84eaccdbe20a95015ecc1bad64f99"}
Mar 08 03:25:25.726796 master-0 kubenswrapper[13046]: I0308 03:25:25.725090 13046 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 08 03:25:25.813569 master-0 kubenswrapper[13046]: I0308 03:25:25.812909 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-t8jqt" podStartSLOduration=3.8128930199999997 podStartE2EDuration="3.81289302s" podCreationTimestamp="2026-03-08 03:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:25.569776021 +0000 UTC m=+727.648543238" watchObservedRunningTime="2026-03-08 03:25:25.81289302 +0000 UTC m=+727.891660237"
Mar 08 03:25:25.822667 master-0 kubenswrapper[13046]: I0308 03:25:25.822194 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-h4zwb"]
Mar 08 03:25:26.564011 master-0 kubenswrapper[13046]: I0308 03:25:26.563963 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-h4zwb" event={"ID":"9eb92440-4e70-4fa6-9315-444d6f99e287","Type":"ContainerStarted","Data":"003df383bc41139add6d5cb2374a403e3ec24fc96253e28a12f31fa880812104"}
Mar 08 03:25:26.569459 master-0 kubenswrapper[13046]: I0308 03:25:26.568689 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vfssr" event={"ID":"7f273cc5-a290-421e-9ad9-b6f0db792fe2","Type":"ContainerStarted","Data":"12c4faa458e3d38c5d85ffebeb3664ae7f1d9fd9431c8e8a3d577598d8952c6b"}
Mar 08 03:25:26.571896 master-0 kubenswrapper[13046]: I0308 03:25:26.571801 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"]
Mar 08 03:25:26.573575 master-0 kubenswrapper[13046]: I0308 03:25:26.572961 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w" event={"ID":"a3a7f795-62ae-4c49-a4a0-e68931bfd4d5","Type":"ContainerStarted","Data":"ec1180379c6af880b2c7d066facb8cfe5b6c56575875569f6393d97987bf4850"}
Mar 08 03:25:26.573575 master-0 kubenswrapper[13046]: I0308 03:25:26.573181 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:26.574569 master-0 kubenswrapper[13046]: I0308 03:25:26.574250 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 08 03:25:26.574773 master-0 kubenswrapper[13046]: I0308 03:25:26.574668 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-5d9sz"
Mar 08 03:25:26.574773 master-0 kubenswrapper[13046]: I0308 03:25:26.574706 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 08 03:25:26.585096 master-0 kubenswrapper[13046]: I0308 03:25:26.584994 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"]
Mar 08 03:25:26.588631 master-0 kubenswrapper[13046]: I0308 03:25:26.586853 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vfssr" podStartSLOduration=2.5868340119999997 podStartE2EDuration="2.586834012s" podCreationTimestamp="2026-03-08 03:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:26.585623368 +0000 UTC m=+728.664390585" watchObservedRunningTime="2026-03-08 03:25:26.586834012 +0000 UTC m=+728.665601229"
Mar 08 03:25:26.645505 master-0 kubenswrapper[13046]: I0308 03:25:26.639048 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e507fbd-c7c5-4371-a316-1a00c7a5751a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-pxnsd\" (UID: \"5e507fbd-c7c5-4371-a316-1a00c7a5751a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:26.645505 master-0 kubenswrapper[13046]: I0308 03:25:26.639159 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e507fbd-c7c5-4371-a316-1a00c7a5751a-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-pxnsd\" (UID: \"5e507fbd-c7c5-4371-a316-1a00c7a5751a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:26.684502 master-0 kubenswrapper[13046]: I0308 03:25:26.683550 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-pts2w" podStartSLOduration=871.683532891 podStartE2EDuration="14m31.683532891s" podCreationTimestamp="2026-03-08 03:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:26.649680752 +0000 UTC m=+728.728447969" watchObservedRunningTime="2026-03-08 03:25:26.683532891 +0000 UTC m=+728.762300108"
Mar 08 03:25:26.742064 master-0 kubenswrapper[13046]: I0308 03:25:26.742000 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e507fbd-c7c5-4371-a316-1a00c7a5751a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-pxnsd\" (UID: \"5e507fbd-c7c5-4371-a316-1a00c7a5751a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:26.742459 master-0 kubenswrapper[13046]: E0308 03:25:26.742418 13046 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 08 03:25:26.742538 master-0 kubenswrapper[13046]: E0308 03:25:26.742506 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e507fbd-c7c5-4371-a316-1a00c7a5751a-networking-console-plugin-cert podName:5e507fbd-c7c5-4371-a316-1a00c7a5751a nodeName:}" failed. No retries permitted until 2026-03-08 03:25:27.242476062 +0000 UTC m=+729.321243279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5e507fbd-c7c5-4371-a316-1a00c7a5751a-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-pxnsd" (UID: "5e507fbd-c7c5-4371-a316-1a00c7a5751a") : secret "networking-console-plugin-cert" not found
Mar 08 03:25:26.742802 master-0 kubenswrapper[13046]: I0308 03:25:26.742778 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e507fbd-c7c5-4371-a316-1a00c7a5751a-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-pxnsd\" (UID: \"5e507fbd-c7c5-4371-a316-1a00c7a5751a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:26.744054 master-0 kubenswrapper[13046]: I0308 03:25:26.744026 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5e507fbd-c7c5-4371-a316-1a00c7a5751a-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-pxnsd\" (UID: \"5e507fbd-c7c5-4371-a316-1a00c7a5751a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:27.179284 master-0 kubenswrapper[13046]: I0308 03:25:27.177110 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 08 03:25:27.179284 master-0 kubenswrapper[13046]: I0308 03:25:27.178230 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.191152 master-0 kubenswrapper[13046]: I0308 03:25:27.185540 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-gsmhw"
Mar 08 03:25:27.191152 master-0 kubenswrapper[13046]: I0308 03:25:27.187571 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 08 03:25:27.191152 master-0 kubenswrapper[13046]: I0308 03:25:27.190101 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 08 03:25:27.249939 master-0 kubenswrapper[13046]: I0308 03:25:27.249863 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e507fbd-c7c5-4371-a316-1a00c7a5751a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-pxnsd\" (UID: \"5e507fbd-c7c5-4371-a316-1a00c7a5751a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:27.250238 master-0 kubenswrapper[13046]: I0308 03:25:27.250152 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.250238 master-0 kubenswrapper[13046]: I0308 03:25:27.250205 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-var-lock\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.250432 master-0 kubenswrapper[13046]: I0308 03:25:27.250376 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d88f12-2fa2-4f01-badf-3543770a14f1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.252686 master-0 kubenswrapper[13046]: I0308 03:25:27.252648 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5e507fbd-c7c5-4371-a316-1a00c7a5751a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-pxnsd\" (UID: \"5e507fbd-c7c5-4371-a316-1a00c7a5751a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:27.352011 master-0 kubenswrapper[13046]: I0308 03:25:27.351857 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.352011 master-0 kubenswrapper[13046]: I0308 03:25:27.351976 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-var-lock\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.352011 master-0 kubenswrapper[13046]: I0308 03:25:27.352007 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-var-lock\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.352259 master-0 kubenswrapper[13046]: I0308 03:25:27.352028 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d88f12-2fa2-4f01-badf-3543770a14f1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.352259 master-0 kubenswrapper[13046]: I0308 03:25:27.351976 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.375967 master-0 kubenswrapper[13046]: I0308 03:25:27.375881 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d88f12-2fa2-4f01-badf-3543770a14f1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.502315 master-0 kubenswrapper[13046]: I0308 03:25:27.502209 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 08 03:25:27.520331 master-0 kubenswrapper[13046]: I0308 03:25:27.520285 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"
Mar 08 03:25:28.187885 master-0 kubenswrapper[13046]: I0308 03:25:28.187574 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bqwmx"]
Mar 08 03:25:28.189218 master-0 kubenswrapper[13046]: I0308 03:25:28.189191 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.194551 master-0 kubenswrapper[13046]: I0308 03:25:28.194506 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-g6bk7"
Mar 08 03:25:28.194787 master-0 kubenswrapper[13046]: I0308 03:25:28.194767 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 08 03:25:28.195358 master-0 kubenswrapper[13046]: I0308 03:25:28.195294 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 08 03:25:28.284770 master-0 kubenswrapper[13046]: I0308 03:25:28.284595 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2k9\" (UniqueName: \"kubernetes.io/projected/265ceba1-ee15-45c5-a422-7b721506b244-kube-api-access-gt2k9\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.284770 master-0 kubenswrapper[13046]: I0308 03:25:28.284641 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/265ceba1-ee15-45c5-a422-7b721506b244-certs\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.284770 master-0 kubenswrapper[13046]: I0308 03:25:28.284664 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/265ceba1-ee15-45c5-a422-7b721506b244-node-bootstrap-token\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.392412 master-0 kubenswrapper[13046]: I0308 03:25:28.392078 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2k9\" (UniqueName: \"kubernetes.io/projected/265ceba1-ee15-45c5-a422-7b721506b244-kube-api-access-gt2k9\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.392412 master-0 kubenswrapper[13046]: I0308 03:25:28.392122 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/265ceba1-ee15-45c5-a422-7b721506b244-certs\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.393230 master-0 kubenswrapper[13046]: I0308 03:25:28.392633 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/265ceba1-ee15-45c5-a422-7b721506b244-node-bootstrap-token\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.396238 master-0 kubenswrapper[13046]: I0308 03:25:28.396103 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/265ceba1-ee15-45c5-a422-7b721506b244-certs\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.397931 master-0 kubenswrapper[13046]: I0308 03:25:28.397882 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/265ceba1-ee15-45c5-a422-7b721506b244-node-bootstrap-token\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.437998 master-0 kubenswrapper[13046]: I0308 03:25:28.437560 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2k9\" (UniqueName: \"kubernetes.io/projected/265ceba1-ee15-45c5-a422-7b721506b244-kube-api-access-gt2k9\") pod \"machine-config-server-bqwmx\" (UID: \"265ceba1-ee15-45c5-a422-7b721506b244\") " pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.516834 master-0 kubenswrapper[13046]: I0308 03:25:28.516709 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bqwmx"
Mar 08 03:25:28.594633 master-0 kubenswrapper[13046]: I0308 03:25:28.593860 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24" event={"ID":"e96e2bdd-2b4f-45c9-8db0-4b910d86d62d","Type":"ContainerStarted","Data":"54121f71ad35dd2b1a4b38991936401889ed2b4030b6e904c2ed364931248222"}
Mar 08 03:25:28.594824 master-0 kubenswrapper[13046]: I0308 03:25:28.594746 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"
Mar 08 03:25:28.596403 master-0 kubenswrapper[13046]: I0308 03:25:28.596326 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bqwmx" event={"ID":"265ceba1-ee15-45c5-a422-7b721506b244","Type":"ContainerStarted","Data":"307b3b9eecdb9e971983e287941902066430a85c5e2daed862eb35b5b17a728a"}
Mar 08 03:25:28.603650 master-0 kubenswrapper[13046]: I0308 03:25:28.598288 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24"
Mar 08 03:25:28.603650 master-0 kubenswrapper[13046]: I0308 03:25:28.598413 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" event={"ID":"afd61ed2-3f0b-4f56-a99a-d93145461181","Type":"ContainerStarted","Data":"7b34364fecf795529fdb256246187f06ac691e2100cfa74dd3aeffc69f5d5533"}
Mar 08 03:25:28.615611 master-0 kubenswrapper[13046]: I0308 03:25:28.615515 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-2zr24" podStartSLOduration=776.865993835 podStartE2EDuration="12m59.615459999s" podCreationTimestamp="2026-03-08 03:12:29 +0000 UTC" firstStartedPulling="2026-03-08 03:25:25.519542827 +0000 UTC m=+727.598310044" lastFinishedPulling="2026-03-08 03:25:28.269008991 +0000 UTC m=+730.347776208" observedRunningTime="2026-03-08 03:25:28.611199898 +0000 UTC m=+730.689967115" watchObservedRunningTime="2026-03-08 03:25:28.615459999 +0000 UTC m=+730.694227216"
Mar 08 03:25:28.651852 master-0 kubenswrapper[13046]: I0308 03:25:28.651784 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" podStartSLOduration=792.39191793 podStartE2EDuration="13m15.651764077s" podCreationTimestamp="2026-03-08 03:12:13 +0000 UTC" firstStartedPulling="2026-03-08 03:25:25.007680562 +0000 UTC m=+727.086447779" lastFinishedPulling="2026-03-08 03:25:28.267526709 +0000 UTC m=+730.346293926" observedRunningTime="2026-03-08 03:25:28.649981007 +0000 UTC m=+730.728748224" watchObservedRunningTime="2026-03-08 03:25:28.651764077 +0000 UTC m=+730.730531294"
Mar 08 03:25:28.765597 master-0 kubenswrapper[13046]: I0308 03:25:28.763540 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd"]
Mar 08 03:25:28.832752 master-0 kubenswrapper[13046]: I0308 03:25:28.832642 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 08 03:25:28.972129 master-0 kubenswrapper[13046]: I0308 03:25:28.972068 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:28.975670 master-0 kubenswrapper[13046]: I0308 03:25:28.975288 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2"
Mar 08 03:25:29.546949 master-0 kubenswrapper[13046]: I0308 03:25:29.545359 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-ck44k"]
Mar 08 03:25:29.551507 master-0 kubenswrapper[13046]: I0308 03:25:29.548959 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k"
Mar 08 03:25:29.552758 master-0 kubenswrapper[13046]: I0308 03:25:29.552030 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 08 03:25:29.552758 master-0 kubenswrapper[13046]: I0308 03:25:29.552372 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-7kngk"
Mar 08 03:25:29.552758 master-0 kubenswrapper[13046]: I0308 03:25:29.552541 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 08 03:25:29.552758 master-0 kubenswrapper[13046]: I0308 03:25:29.552667 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 08 03:25:29.565928 master-0 kubenswrapper[13046]: I0308 03:25:29.565634 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-ck44k"]
Mar 08 03:25:29.606415
master-0 kubenswrapper[13046]: I0308 03:25:29.606371 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bqwmx" event={"ID":"265ceba1-ee15-45c5-a422-7b721506b244","Type":"ContainerStarted","Data":"7256c040d7941c1514ca83e126bd230f04ae237ecd2e6e860ebb96915d3ed12a"} Mar 08 03:25:29.608594 master-0 kubenswrapper[13046]: I0308 03:25:29.608547 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd" event={"ID":"5e507fbd-c7c5-4371-a316-1a00c7a5751a","Type":"ContainerStarted","Data":"2205a995b9671ba8771fe712344e977cb66c9488af28872ce4cf4eed26e5853d"} Mar 08 03:25:29.612530 master-0 kubenswrapper[13046]: I0308 03:25:29.612497 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"e8d88f12-2fa2-4f01-badf-3543770a14f1","Type":"ContainerStarted","Data":"c039518268c4e5b626b34db008084a8b51e0328a3e98352d92c8cc40d2d679f2"} Mar 08 03:25:29.612623 master-0 kubenswrapper[13046]: I0308 03:25:29.612539 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:29.612623 master-0 kubenswrapper[13046]: I0308 03:25:29.612550 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"e8d88f12-2fa2-4f01-badf-3543770a14f1","Type":"ContainerStarted","Data":"ea3ee0427921ab302a9830fe327e17855ab0fec493db468f66488876aa20cb24"} Mar 08 03:25:29.614779 master-0 kubenswrapper[13046]: I0308 03:25:29.614754 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8cd6fdd-5s2m2" Mar 08 03:25:29.630395 master-0 kubenswrapper[13046]: I0308 03:25:29.627320 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bqwmx" 
podStartSLOduration=1.627302082 podStartE2EDuration="1.627302082s" podCreationTimestamp="2026-03-08 03:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:29.61912899 +0000 UTC m=+731.697896207" watchObservedRunningTime="2026-03-08 03:25:29.627302082 +0000 UTC m=+731.706069299" Mar 08 03:25:29.647128 master-0 kubenswrapper[13046]: I0308 03:25:29.647028 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=2.64700539 podStartE2EDuration="2.64700539s" podCreationTimestamp="2026-03-08 03:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:29.640582908 +0000 UTC m=+731.719350145" watchObservedRunningTime="2026-03-08 03:25:29.64700539 +0000 UTC m=+731.725772617" Mar 08 03:25:29.728533 master-0 kubenswrapper[13046]: I0308 03:25:29.726323 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trv25\" (UniqueName: \"kubernetes.io/projected/08259d7a-3093-4f7d-b1ef-04f0f954e986-kube-api-access-trv25\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.728533 master-0 kubenswrapper[13046]: I0308 03:25:29.726402 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08259d7a-3093-4f7d-b1ef-04f0f954e986-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.728533 master-0 kubenswrapper[13046]: I0308 
03:25:29.726438 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08259d7a-3093-4f7d-b1ef-04f0f954e986-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.728533 master-0 kubenswrapper[13046]: I0308 03:25:29.726458 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/08259d7a-3093-4f7d-b1ef-04f0f954e986-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.827193 master-0 kubenswrapper[13046]: I0308 03:25:29.827056 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08259d7a-3093-4f7d-b1ef-04f0f954e986-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.827193 master-0 kubenswrapper[13046]: I0308 03:25:29.827138 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08259d7a-3093-4f7d-b1ef-04f0f954e986-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.827430 master-0 kubenswrapper[13046]: I0308 03:25:29.827324 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/08259d7a-3093-4f7d-b1ef-04f0f954e986-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.830548 master-0 kubenswrapper[13046]: I0308 03:25:29.827551 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trv25\" (UniqueName: \"kubernetes.io/projected/08259d7a-3093-4f7d-b1ef-04f0f954e986-kube-api-access-trv25\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.830548 master-0 kubenswrapper[13046]: I0308 03:25:29.828103 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08259d7a-3093-4f7d-b1ef-04f0f954e986-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.830548 master-0 kubenswrapper[13046]: I0308 03:25:29.830352 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08259d7a-3093-4f7d-b1ef-04f0f954e986-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.843629 master-0 kubenswrapper[13046]: I0308 03:25:29.841546 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/08259d7a-3093-4f7d-b1ef-04f0f954e986-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " 
pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.852919 master-0 kubenswrapper[13046]: I0308 03:25:29.852863 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trv25\" (UniqueName: \"kubernetes.io/projected/08259d7a-3093-4f7d-b1ef-04f0f954e986-kube-api-access-trv25\") pod \"prometheus-operator-5ff8674d55-ck44k\" (UID: \"08259d7a-3093-4f7d-b1ef-04f0f954e986\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:29.879514 master-0 kubenswrapper[13046]: I0308 03:25:29.879454 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" Mar 08 03:25:30.307291 master-0 kubenswrapper[13046]: I0308 03:25:30.307153 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-ck44k"] Mar 08 03:25:31.629571 master-0 kubenswrapper[13046]: I0308 03:25:31.629318 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" event={"ID":"08259d7a-3093-4f7d-b1ef-04f0f954e986","Type":"ContainerStarted","Data":"7bc1561bfef2636fcb85105e8299834ff04ed3ed26d7acff1ac990fbed1c08d3"} Mar 08 03:25:31.630878 master-0 kubenswrapper[13046]: I0308 03:25:31.630431 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd" event={"ID":"5e507fbd-c7c5-4371-a316-1a00c7a5751a","Type":"ContainerStarted","Data":"5055aaab855b641294dff1af1b3c229ef0b9b43986802c253a3b42a15067a0c0"} Mar 08 03:25:32.650511 master-0 kubenswrapper[13046]: I0308 03:25:32.646929 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" event={"ID":"08259d7a-3093-4f7d-b1ef-04f0f954e986","Type":"ContainerStarted","Data":"58b255a26df3b9a73b716fb58a3044c530da31a66eced71158579e830b0b7994"} Mar 08 03:25:32.867224 
master-0 kubenswrapper[13046]: I0308 03:25:32.867093 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cbd49d755-pxnsd" podStartSLOduration=4.858772329 podStartE2EDuration="6.86707559s" podCreationTimestamp="2026-03-08 03:25:26 +0000 UTC" firstStartedPulling="2026-03-08 03:25:28.780605668 +0000 UTC m=+730.859372885" lastFinishedPulling="2026-03-08 03:25:30.788908929 +0000 UTC m=+732.867676146" observedRunningTime="2026-03-08 03:25:31.650213336 +0000 UTC m=+733.728980563" watchObservedRunningTime="2026-03-08 03:25:32.86707559 +0000 UTC m=+734.945842807" Mar 08 03:25:32.869763 master-0 kubenswrapper[13046]: I0308 03:25:32.869717 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5fff964fcf-s5mbf"] Mar 08 03:25:32.872502 master-0 kubenswrapper[13046]: I0308 03:25:32.872362 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.878512 master-0 kubenswrapper[13046]: I0308 03:25:32.877189 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 03:25:32.878512 master-0 kubenswrapper[13046]: I0308 03:25:32.877327 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 03:25:32.883523 master-0 kubenswrapper[13046]: I0308 03:25:32.882762 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 03:25:32.883523 master-0 kubenswrapper[13046]: I0308 03:25:32.883189 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 03:25:32.883523 master-0 kubenswrapper[13046]: I0308 03:25:32.883410 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 03:25:32.885706 master-0 kubenswrapper[13046]: I0308 03:25:32.883834 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-9f7bj" Mar 08 03:25:32.885706 master-0 kubenswrapper[13046]: I0308 03:25:32.883974 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 03:25:32.885706 master-0 kubenswrapper[13046]: I0308 03:25:32.884061 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 03:25:32.885706 master-0 kubenswrapper[13046]: I0308 03:25:32.884154 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 03:25:32.885706 master-0 kubenswrapper[13046]: I0308 03:25:32.884273 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 03:25:32.885706 master-0 kubenswrapper[13046]: I0308 03:25:32.884401 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 03:25:32.885706 master-0 kubenswrapper[13046]: I0308 03:25:32.884953 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 03:25:32.892320 master-0 kubenswrapper[13046]: I0308 03:25:32.892146 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fff964fcf-s5mbf"] Mar 08 03:25:32.895115 master-0 kubenswrapper[13046]: I0308 03:25:32.895079 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 03:25:32.917448 master-0 kubenswrapper[13046]: I0308 03:25:32.917398 13046 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 03:25:32.983687 master-0 kubenswrapper[13046]: I0308 03:25:32.983631 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74hmp\" (UniqueName: \"kubernetes.io/projected/1f559362-f339-4de3-9666-757654e9c35e-kube-api-access-74hmp\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983701 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983730 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-session\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983751 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " 
pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983797 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983818 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f559362-f339-4de3-9666-757654e9c35e-audit-dir\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983834 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983857 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.983873 master-0 kubenswrapper[13046]: I0308 03:25:32.983872 
13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.984109 master-0 kubenswrapper[13046]: I0308 03:25:32.983897 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-audit-policies\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.984109 master-0 kubenswrapper[13046]: I0308 03:25:32.983917 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.984109 master-0 kubenswrapper[13046]: I0308 03:25:32.983938 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:32.984109 master-0 kubenswrapper[13046]: I0308 03:25:32.983967 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086532 master-0 kubenswrapper[13046]: I0308 03:25:33.086471 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74hmp\" (UniqueName: \"kubernetes.io/projected/1f559362-f339-4de3-9666-757654e9c35e-kube-api-access-74hmp\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086747 master-0 kubenswrapper[13046]: I0308 03:25:33.086583 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086747 master-0 kubenswrapper[13046]: I0308 03:25:33.086613 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-session\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086747 master-0 kubenswrapper[13046]: I0308 03:25:33.086650 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-login\") pod 
\"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086747 master-0 kubenswrapper[13046]: I0308 03:25:33.086697 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086747 master-0 kubenswrapper[13046]: I0308 03:25:33.086719 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f559362-f339-4de3-9666-757654e9c35e-audit-dir\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086954 master-0 kubenswrapper[13046]: I0308 03:25:33.086735 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086954 master-0 kubenswrapper[13046]: I0308 03:25:33.086842 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086954 master-0 kubenswrapper[13046]: I0308 
03:25:33.086859 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086954 master-0 kubenswrapper[13046]: I0308 03:25:33.086907 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-audit-policies\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086954 master-0 kubenswrapper[13046]: I0308 03:25:33.086925 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.086954 master-0 kubenswrapper[13046]: I0308 03:25:33.086956 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.087461 master-0 kubenswrapper[13046]: I0308 03:25:33.086989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.087675 master-0 kubenswrapper[13046]: I0308 03:25:33.087623 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f559362-f339-4de3-9666-757654e9c35e-audit-dir\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.089505 master-0 kubenswrapper[13046]: I0308 03:25:33.089447 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.089994 master-0 kubenswrapper[13046]: I0308 03:25:33.089953 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.090746 master-0 kubenswrapper[13046]: I0308 03:25:33.090715 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-audit-policies\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 
03:25:33.091772 master-0 kubenswrapper[13046]: I0308 03:25:33.091747 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.092131 master-0 kubenswrapper[13046]: I0308 03:25:33.092093 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.093334 master-0 kubenswrapper[13046]: I0308 03:25:33.093301 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.093622 master-0 kubenswrapper[13046]: I0308 03:25:33.093584 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.094526 master-0 kubenswrapper[13046]: I0308 03:25:33.094466 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-session\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.095889 master-0 kubenswrapper[13046]: I0308 03:25:33.095855 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.097130 master-0 kubenswrapper[13046]: I0308 03:25:33.097086 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.101684 master-0 kubenswrapper[13046]: I0308 03:25:33.101642 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.104993 master-0 kubenswrapper[13046]: I0308 03:25:33.104947 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74hmp\" (UniqueName: \"kubernetes.io/projected/1f559362-f339-4de3-9666-757654e9c35e-kube-api-access-74hmp\") pod \"oauth-openshift-5fff964fcf-s5mbf\" (UID: 
\"1f559362-f339-4de3-9666-757654e9c35e\") " pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.199562 master-0 kubenswrapper[13046]: I0308 03:25:33.198370 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:33.609141 master-0 kubenswrapper[13046]: I0308 03:25:33.609075 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fff964fcf-s5mbf"] Mar 08 03:25:33.615775 master-0 kubenswrapper[13046]: W0308 03:25:33.615732 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f559362_f339_4de3_9666_757654e9c35e.slice/crio-f344d49c740f287f640ec7329405534c81d8db828faa4614d580010632defbdd WatchSource:0}: Error finding container f344d49c740f287f640ec7329405534c81d8db828faa4614d580010632defbdd: Status 404 returned error can't find the container with id f344d49c740f287f640ec7329405534c81d8db828faa4614d580010632defbdd Mar 08 03:25:33.664072 master-0 kubenswrapper[13046]: I0308 03:25:33.663987 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" event={"ID":"1f559362-f339-4de3-9666-757654e9c35e","Type":"ContainerStarted","Data":"f344d49c740f287f640ec7329405534c81d8db828faa4614d580010632defbdd"} Mar 08 03:25:33.665989 master-0 kubenswrapper[13046]: I0308 03:25:33.665944 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" event={"ID":"08259d7a-3093-4f7d-b1ef-04f0f954e986","Type":"ContainerStarted","Data":"4b97f3cca7fc79c0a461c84c653d1ae14182fe027fe3825b37f4c0f6ce5792d5"} Mar 08 03:25:33.701770 master-0 kubenswrapper[13046]: I0308 03:25:33.701666 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5ff8674d55-ck44k" 
podStartSLOduration=3.091881168 podStartE2EDuration="4.701632675s" podCreationTimestamp="2026-03-08 03:25:29 +0000 UTC" firstStartedPulling="2026-03-08 03:25:30.752638491 +0000 UTC m=+732.831405708" lastFinishedPulling="2026-03-08 03:25:32.362389998 +0000 UTC m=+734.441157215" observedRunningTime="2026-03-08 03:25:33.689082437 +0000 UTC m=+735.767849664" watchObservedRunningTime="2026-03-08 03:25:33.701632675 +0000 UTC m=+735.780399932" Mar 08 03:25:34.198061 master-0 kubenswrapper[13046]: I0308 03:25:34.198015 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 03:25:34.198995 master-0 kubenswrapper[13046]: I0308 03:25:34.198975 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.202164 master-0 kubenswrapper[13046]: I0308 03:25:34.202004 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 03:25:34.202164 master-0 kubenswrapper[13046]: I0308 03:25:34.202050 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ks2rl" Mar 08 03:25:34.220264 master-0 kubenswrapper[13046]: I0308 03:25:34.220219 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 03:25:34.317503 master-0 kubenswrapper[13046]: I0308 03:25:34.315105 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-var-lock\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.317503 master-0 kubenswrapper[13046]: I0308 03:25:34.315166 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c539a17-b57b-446a-b50d-976adc8766ef-kube-api-access\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.317503 master-0 kubenswrapper[13046]: I0308 03:25:34.315263 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.416525 master-0 kubenswrapper[13046]: I0308 03:25:34.416463 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.416756 master-0 kubenswrapper[13046]: I0308 03:25:34.416543 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-var-lock\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.416756 master-0 kubenswrapper[13046]: I0308 03:25:34.416566 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c539a17-b57b-446a-b50d-976adc8766ef-kube-api-access\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.416756 master-0 
kubenswrapper[13046]: I0308 03:25:34.416674 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.417068 master-0 kubenswrapper[13046]: I0308 03:25:34.416784 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-var-lock\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.434684 master-0 kubenswrapper[13046]: I0308 03:25:34.434631 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c539a17-b57b-446a-b50d-976adc8766ef-kube-api-access\") pod \"installer-4-master-0\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.540764 master-0 kubenswrapper[13046]: I0308 03:25:34.540653 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-dc7d49677-gsx8f"] Mar 08 03:25:34.542892 master-0 kubenswrapper[13046]: I0308 03:25:34.541849 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.545576 master-0 kubenswrapper[13046]: I0308 03:25:34.544647 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 08 03:25:34.552651 master-0 kubenswrapper[13046]: I0308 03:25:34.552599 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 03:25:34.558500 master-0 kubenswrapper[13046]: I0308 03:25:34.553020 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 03:25:34.558500 master-0 kubenswrapper[13046]: I0308 03:25:34.553248 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 03:25:34.558500 master-0 kubenswrapper[13046]: I0308 03:25:34.553431 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-tqmc4" Mar 08 03:25:34.558500 master-0 kubenswrapper[13046]: I0308 03:25:34.554181 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 03:25:34.558500 master-0 kubenswrapper[13046]: I0308 03:25:34.554451 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 03:25:34.560047 master-0 kubenswrapper[13046]: I0308 03:25:34.558879 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dc7d49677-gsx8f"] Mar 08 03:25:34.621278 master-0 kubenswrapper[13046]: I0308 03:25:34.621228 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-service-ca\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.621471 master-0 kubenswrapper[13046]: I0308 03:25:34.621394 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-console-config\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.621471 master-0 kubenswrapper[13046]: I0308 03:25:34.621426 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn4j6\" (UniqueName: \"kubernetes.io/projected/158946fc-eae1-4823-a93c-398d4aede495-kube-api-access-vn4j6\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.621471 master-0 kubenswrapper[13046]: I0308 03:25:34.621467 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-serving-cert\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.621592 master-0 kubenswrapper[13046]: I0308 03:25:34.621511 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-oauth-serving-cert\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.621592 master-0 kubenswrapper[13046]: I0308 03:25:34.621534 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-oauth-config\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.722947 master-0 kubenswrapper[13046]: 
I0308 03:25:34.722897 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-console-config\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.722947 master-0 kubenswrapper[13046]: I0308 03:25:34.722954 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn4j6\" (UniqueName: \"kubernetes.io/projected/158946fc-eae1-4823-a93c-398d4aede495-kube-api-access-vn4j6\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.723530 master-0 kubenswrapper[13046]: I0308 03:25:34.723127 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-serving-cert\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.723530 master-0 kubenswrapper[13046]: I0308 03:25:34.723185 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-oauth-serving-cert\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.723530 master-0 kubenswrapper[13046]: I0308 03:25:34.723219 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-oauth-config\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 
03:25:34.723530 master-0 kubenswrapper[13046]: I0308 03:25:34.723266 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-service-ca\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.726710 master-0 kubenswrapper[13046]: I0308 03:25:34.725662 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-oauth-serving-cert\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.726710 master-0 kubenswrapper[13046]: I0308 03:25:34.726060 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-console-config\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.727037 master-0 kubenswrapper[13046]: I0308 03:25:34.726992 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-service-ca\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.731022 master-0 kubenswrapper[13046]: I0308 03:25:34.730979 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-oauth-config\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.739134 master-0 
kubenswrapper[13046]: I0308 03:25:34.738563 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn4j6\" (UniqueName: \"kubernetes.io/projected/158946fc-eae1-4823-a93c-398d4aede495-kube-api-access-vn4j6\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.743827 master-0 kubenswrapper[13046]: I0308 03:25:34.742007 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-serving-cert\") pod \"console-dc7d49677-gsx8f\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") " pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:34.915381 master-0 kubenswrapper[13046]: I0308 03:25:34.915303 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:35.084889 master-0 kubenswrapper[13046]: I0308 03:25:35.079526 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 08 03:25:35.377642 master-0 kubenswrapper[13046]: I0308 03:25:35.377586 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dc7d49677-gsx8f"] Mar 08 03:25:35.695561 master-0 kubenswrapper[13046]: I0308 03:25:35.695504 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"6c539a17-b57b-446a-b50d-976adc8766ef","Type":"ContainerStarted","Data":"b615173188d3c6ef1796ab2b05330b5d0d8e92137820ea937309af44c50f2c0e"} Mar 08 03:25:35.695561 master-0 kubenswrapper[13046]: I0308 03:25:35.695554 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" 
event={"ID":"6c539a17-b57b-446a-b50d-976adc8766ef","Type":"ContainerStarted","Data":"aab46491c5b58c93bbde4dc1b1fbf43386a5ee6a7bfde311b9d73e2882ce63cf"} Mar 08 03:25:35.709595 master-0 kubenswrapper[13046]: I0308 03:25:35.708585 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc7d49677-gsx8f" event={"ID":"158946fc-eae1-4823-a93c-398d4aede495","Type":"ContainerStarted","Data":"19f7a1f66bac10972eeab9d6c4173ff7b45da8ecd66dc311e55906df50ba3772"} Mar 08 03:25:35.725143 master-0 kubenswrapper[13046]: I0308 03:25:35.725087 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=1.72507075 podStartE2EDuration="1.72507075s" podCreationTimestamp="2026-03-08 03:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:35.723133436 +0000 UTC m=+737.801900653" watchObservedRunningTime="2026-03-08 03:25:35.72507075 +0000 UTC m=+737.803837967" Mar 08 03:25:35.958756 master-0 kubenswrapper[13046]: I0308 03:25:35.958672 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt"] Mar 08 03:25:35.962534 master-0 kubenswrapper[13046]: I0308 03:25:35.962019 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:35.967544 master-0 kubenswrapper[13046]: I0308 03:25:35.966067 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-lf85f" Mar 08 03:25:35.967544 master-0 kubenswrapper[13046]: I0308 03:25:35.966229 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 08 03:25:35.967544 master-0 kubenswrapper[13046]: I0308 03:25:35.966327 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 08 03:25:35.975304 master-0 kubenswrapper[13046]: I0308 03:25:35.975256 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt"] Mar 08 03:25:36.041825 master-0 kubenswrapper[13046]: I0308 03:25:36.039611 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5"] Mar 08 03:25:36.041825 master-0 kubenswrapper[13046]: I0308 03:25:36.040922 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.047384 master-0 kubenswrapper[13046]: I0308 03:25:36.046975 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-9xbqs" Mar 08 03:25:36.047384 master-0 kubenswrapper[13046]: I0308 03:25:36.047186 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 08 03:25:36.047384 master-0 kubenswrapper[13046]: I0308 03:25:36.047336 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 08 03:25:36.047607 master-0 kubenswrapper[13046]: I0308 03:25:36.047393 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 08 03:25:36.053463 master-0 kubenswrapper[13046]: I0308 03:25:36.050477 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfb5r\" (UniqueName: \"kubernetes.io/projected/268919d0-afa6-48ed-a6cb-3f558fc78b5d-kube-api-access-pfb5r\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.053463 master-0 kubenswrapper[13046]: I0308 03:25:36.050562 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.053463 master-0 kubenswrapper[13046]: I0308 03:25:36.050612 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.053463 master-0 kubenswrapper[13046]: I0308 03:25:36.050643 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/268919d0-afa6-48ed-a6cb-3f558fc78b5d-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.058101 master-0 kubenswrapper[13046]: I0308 03:25:36.057069 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rtdkr"] Mar 08 03:25:36.058719 master-0 kubenswrapper[13046]: I0308 03:25:36.058693 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.063373 master-0 kubenswrapper[13046]: I0308 03:25:36.062080 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 08 03:25:36.063373 master-0 kubenswrapper[13046]: I0308 03:25:36.062112 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-wxpwp" Mar 08 03:25:36.063373 master-0 kubenswrapper[13046]: I0308 03:25:36.062159 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 08 03:25:36.070630 master-0 kubenswrapper[13046]: I0308 03:25:36.069908 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5"] Mar 08 03:25:36.152468 master-0 kubenswrapper[13046]: I0308 03:25:36.152428 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.152710 master-0 kubenswrapper[13046]: I0308 03:25:36.152693 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/268919d0-afa6-48ed-a6cb-3f558fc78b5d-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.152793 master-0 kubenswrapper[13046]: I0308 03:25:36.152779 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.152866 master-0 kubenswrapper[13046]: I0308 03:25:36.152853 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2ea520cd-5fd4-4354-8cbb-38539cbef506-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.152947 master-0 kubenswrapper[13046]: I0308 03:25:36.152934 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea520cd-5fd4-4354-8cbb-38539cbef506-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.153023 master-0 kubenswrapper[13046]: I0308 03:25:36.153011 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.153107 master-0 kubenswrapper[13046]: I0308 03:25:36.153093 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfb5r\" (UniqueName: \"kubernetes.io/projected/268919d0-afa6-48ed-a6cb-3f558fc78b5d-kube-api-access-pfb5r\") pod 
\"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.153183 master-0 kubenswrapper[13046]: I0308 03:25:36.153171 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-tls\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.153285 master-0 kubenswrapper[13046]: I0308 03:25:36.153273 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-root\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.153418 master-0 kubenswrapper[13046]: I0308 03:25:36.153404 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-wtmp\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.153529 master-0 kubenswrapper[13046]: I0308 03:25:36.153516 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.153637 master-0 kubenswrapper[13046]: I0308 03:25:36.153607 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-textfile\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.153728 master-0 kubenswrapper[13046]: I0308 03:25:36.153715 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.153806 master-0 kubenswrapper[13046]: I0308 03:25:36.153793 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.153899 master-0 kubenswrapper[13046]: I0308 03:25:36.153872 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23651d40-89da-46c4-a6cb-b4c031e826cb-metrics-client-ca\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.153987 master-0 kubenswrapper[13046]: I0308 03:25:36.153974 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx98n\" (UniqueName: 
\"kubernetes.io/projected/23651d40-89da-46c4-a6cb-b4c031e826cb-kube-api-access-xx98n\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.154061 master-0 kubenswrapper[13046]: I0308 03:25:36.154049 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rth94\" (UniqueName: \"kubernetes.io/projected/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-api-access-rth94\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.154128 master-0 kubenswrapper[13046]: I0308 03:25:36.154117 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-sys\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.154652 master-0 kubenswrapper[13046]: E0308 03:25:36.154637 13046 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Mar 08 03:25:36.154861 master-0 kubenswrapper[13046]: E0308 03:25:36.154747 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-tls podName:268919d0-afa6-48ed-a6cb-3f558fc78b5d nodeName:}" failed. No retries permitted until 2026-03-08 03:25:36.654732312 +0000 UTC m=+738.733499529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-tls") pod "openshift-state-metrics-74cc79fd76-mcvbt" (UID: "268919d0-afa6-48ed-a6cb-3f558fc78b5d") : secret "openshift-state-metrics-tls" not found Mar 08 03:25:36.158719 master-0 kubenswrapper[13046]: I0308 03:25:36.158398 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/268919d0-afa6-48ed-a6cb-3f558fc78b5d-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.162878 master-0 kubenswrapper[13046]: I0308 03:25:36.160078 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.183451 master-0 kubenswrapper[13046]: I0308 03:25:36.183163 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfb5r\" (UniqueName: \"kubernetes.io/projected/268919d0-afa6-48ed-a6cb-3f558fc78b5d-kube-api-access-pfb5r\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.255763 master-0 kubenswrapper[13046]: I0308 03:25:36.255554 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-tls\") pod 
\"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.255763 master-0 kubenswrapper[13046]: I0308 03:25:36.255618 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-textfile\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.255763 master-0 kubenswrapper[13046]: I0308 03:25:36.255668 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.255763 master-0 kubenswrapper[13046]: I0308 03:25:36.255694 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23651d40-89da-46c4-a6cb-b4c031e826cb-metrics-client-ca\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.255763 master-0 kubenswrapper[13046]: I0308 03:25:36.255713 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx98n\" (UniqueName: \"kubernetes.io/projected/23651d40-89da-46c4-a6cb-b4c031e826cb-kube-api-access-xx98n\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.255763 master-0 kubenswrapper[13046]: I0308 03:25:36.255731 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rth94\" (UniqueName: \"kubernetes.io/projected/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-api-access-rth94\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.255763 master-0 kubenswrapper[13046]: I0308 03:25:36.255751 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-sys\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.255787 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.255827 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2ea520cd-5fd4-4354-8cbb-38539cbef506-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.255855 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea520cd-5fd4-4354-8cbb-38539cbef506-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.255875 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.255895 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-tls\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.255910 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-root\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.255937 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-wtmp\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.256120 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-wtmp\") pod 
\"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.256566 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-sys\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.257107 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23651d40-89da-46c4-a6cb-b4c031e826cb-metrics-client-ca\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.257253 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-textfile\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.258130 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.260738 master-0 kubenswrapper[13046]: I0308 03:25:36.258684 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/23651d40-89da-46c4-a6cb-b4c031e826cb-root\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.262529 master-0 kubenswrapper[13046]: I0308 03:25:36.262031 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.262529 master-0 kubenswrapper[13046]: I0308 03:25:36.262118 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2ea520cd-5fd4-4354-8cbb-38539cbef506-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.262529 master-0 kubenswrapper[13046]: I0308 03:25:36.262462 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2ea520cd-5fd4-4354-8cbb-38539cbef506-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.264711 master-0 kubenswrapper[13046]: I0308 03:25:36.263598 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-tls\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.264840 master-0 kubenswrapper[13046]: I0308 03:25:36.264788 
13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23651d40-89da-46c4-a6cb-b4c031e826cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.278756 master-0 kubenswrapper[13046]: I0308 03:25:36.278724 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.284609 master-0 kubenswrapper[13046]: I0308 03:25:36.282068 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rth94\" (UniqueName: \"kubernetes.io/projected/2ea520cd-5fd4-4354-8cbb-38539cbef506-kube-api-access-rth94\") pod \"kube-state-metrics-68b88f8cb5-t5lc5\" (UID: \"2ea520cd-5fd4-4354-8cbb-38539cbef506\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.287644 master-0 kubenswrapper[13046]: I0308 03:25:36.286228 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx98n\" (UniqueName: \"kubernetes.io/projected/23651d40-89da-46c4-a6cb-b4c031e826cb-kube-api-access-xx98n\") pod \"node-exporter-rtdkr\" (UID: \"23651d40-89da-46c4-a6cb-b4c031e826cb\") " pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.380474 master-0 kubenswrapper[13046]: I0308 03:25:36.379892 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" Mar 08 03:25:36.401967 master-0 kubenswrapper[13046]: I0308 03:25:36.401920 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rtdkr" Mar 08 03:25:36.673540 master-0 kubenswrapper[13046]: I0308 03:25:36.672310 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.675516 master-0 kubenswrapper[13046]: I0308 03:25:36.675448 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/268919d0-afa6-48ed-a6cb-3f558fc78b5d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-mcvbt\" (UID: \"268919d0-afa6-48ed-a6cb-3f558fc78b5d\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:36.894254 master-0 kubenswrapper[13046]: I0308 03:25:36.893252 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" Mar 08 03:25:37.745888 master-0 kubenswrapper[13046]: I0308 03:25:37.744458 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rtdkr" event={"ID":"23651d40-89da-46c4-a6cb-b4c031e826cb","Type":"ContainerStarted","Data":"410c0d6a82fd794c4427a63085ac0381fa3490d73c0362e0bb99f60fcbdeea5a"} Mar 08 03:25:38.754562 master-0 kubenswrapper[13046]: I0308 03:25:38.754458 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" event={"ID":"1f559362-f339-4de3-9666-757654e9c35e","Type":"ContainerStarted","Data":"baf31e0018548ce23470fd372aed90f69f405884c0594cc4cd64bd69dc859f86"} Mar 08 03:25:38.755921 master-0 kubenswrapper[13046]: I0308 03:25:38.754787 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:39.754953 master-0 kubenswrapper[13046]: I0308 03:25:39.754906 13046 patch_prober.go:28] interesting pod/oauth-openshift-5fff964fcf-s5mbf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.90:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:25:39.755533 master-0 kubenswrapper[13046]: I0308 03:25:39.755506 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" podUID="1f559362-f339-4de3-9666-757654e9c35e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.90:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 03:25:40.016693 master-0 kubenswrapper[13046]: I0308 03:25:40.004622 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt"] Mar 08 
03:25:40.025169 master-0 kubenswrapper[13046]: I0308 03:25:40.023238 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5"] Mar 08 03:25:40.025169 master-0 kubenswrapper[13046]: W0308 03:25:40.023347 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod268919d0_afa6_48ed_a6cb_3f558fc78b5d.slice/crio-838771668fa6a6886510c4d57741c27fb3b01879c764995f0bb071b28966178c WatchSource:0}: Error finding container 838771668fa6a6886510c4d57741c27fb3b01879c764995f0bb071b28966178c: Status 404 returned error can't find the container with id 838771668fa6a6886510c4d57741c27fb3b01879c764995f0bb071b28966178c Mar 08 03:25:40.095374 master-0 kubenswrapper[13046]: I0308 03:25:40.094885 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:25:40.184972 master-0 kubenswrapper[13046]: I0308 03:25:40.184185 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5fff964fcf-s5mbf"] Mar 08 03:25:40.237375 master-0 kubenswrapper[13046]: I0308 03:25:40.235391 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" podStartSLOduration=4.128608708 podStartE2EDuration="8.235371979s" podCreationTimestamp="2026-03-08 03:25:32 +0000 UTC" firstStartedPulling="2026-03-08 03:25:33.617842742 +0000 UTC m=+735.696609959" lastFinishedPulling="2026-03-08 03:25:37.724606012 +0000 UTC m=+739.803373230" observedRunningTime="2026-03-08 03:25:40.226904254 +0000 UTC m=+742.305671481" watchObservedRunningTime="2026-03-08 03:25:40.235371979 +0000 UTC m=+742.314139196" Mar 08 03:25:40.362049 master-0 kubenswrapper[13046]: I0308 03:25:40.359846 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 
03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.371270 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.378280 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.378427 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.378556 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.378714 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.378876 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.378978 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-hrrx2" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.378562 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 08 03:25:40.380179 master-0 kubenswrapper[13046]: I0308 03:25:40.379144 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 03:25:40.383548 master-0 kubenswrapper[13046]: I0308 03:25:40.383476 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:25:40.386234 master-0 
kubenswrapper[13046]: I0308 03:25:40.385892 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 08 03:25:40.472043 master-0 kubenswrapper[13046]: I0308 03:25:40.472002 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472212 master-0 kubenswrapper[13046]: I0308 03:25:40.472044 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472212 master-0 kubenswrapper[13046]: I0308 03:25:40.472096 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms88d\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-kube-api-access-ms88d\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472212 master-0 kubenswrapper[13046]: I0308 03:25:40.472114 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-out\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472212 master-0 kubenswrapper[13046]: I0308 03:25:40.472144 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472212 master-0 kubenswrapper[13046]: I0308 03:25:40.472199 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472371 master-0 kubenswrapper[13046]: I0308 03:25:40.472221 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-web-config\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472371 master-0 kubenswrapper[13046]: I0308 03:25:40.472250 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472371 master-0 kubenswrapper[13046]: I0308 03:25:40.472298 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472371 master-0 kubenswrapper[13046]: I0308 03:25:40.472322 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472371 master-0 kubenswrapper[13046]: I0308 03:25:40.472342 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.472371 master-0 kubenswrapper[13046]: I0308 03:25:40.472362 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-volume\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575256 master-0 kubenswrapper[13046]: I0308 03:25:40.575167 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-volume\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575256 master-0 kubenswrapper[13046]: I0308 03:25:40.575237 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575392 master-0 kubenswrapper[13046]: I0308 03:25:40.575274 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575392 master-0 kubenswrapper[13046]: I0308 03:25:40.575318 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms88d\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-kube-api-access-ms88d\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575392 master-0 kubenswrapper[13046]: I0308 03:25:40.575344 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-out\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575392 master-0 kubenswrapper[13046]: I0308 03:25:40.575385 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575620 master-0 kubenswrapper[13046]: I0308 03:25:40.575427 13046 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575620 master-0 kubenswrapper[13046]: I0308 03:25:40.575456 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-web-config\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575620 master-0 kubenswrapper[13046]: I0308 03:25:40.575495 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575620 master-0 kubenswrapper[13046]: I0308 03:25:40.575534 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575620 master-0 kubenswrapper[13046]: I0308 03:25:40.575564 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.575620 master-0 kubenswrapper[13046]: I0308 03:25:40.575595 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.577693 master-0 kubenswrapper[13046]: I0308 03:25:40.577567 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.577785 master-0 kubenswrapper[13046]: I0308 03:25:40.577737 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.578256 master-0 kubenswrapper[13046]: I0308 03:25:40.578229 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.582612 master-0 kubenswrapper[13046]: I0308 03:25:40.582336 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.584097 master-0 kubenswrapper[13046]: I0308 03:25:40.584064 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.587402 master-0 kubenswrapper[13046]: I0308 03:25:40.586903 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-out\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.587402 master-0 kubenswrapper[13046]: I0308 03:25:40.586907 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-volume\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.587985 master-0 kubenswrapper[13046]: I0308 03:25:40.587701 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-web-config\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.587985 master-0 kubenswrapper[13046]: I0308 03:25:40.587732 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-tls-assets\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 
03:25:40.587985 master-0 kubenswrapper[13046]: I0308 03:25:40.587896 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.589762 master-0 kubenswrapper[13046]: I0308 03:25:40.589719 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.599390 master-0 kubenswrapper[13046]: I0308 03:25:40.599324 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms88d\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-kube-api-access-ms88d\") pod \"alertmanager-main-0\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.772754 master-0 kubenswrapper[13046]: I0308 03:25:40.772707 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:25:40.780815 master-0 kubenswrapper[13046]: I0308 03:25:40.780749 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" event={"ID":"268919d0-afa6-48ed-a6cb-3f558fc78b5d","Type":"ContainerStarted","Data":"177015841a2542e91e3f136233211b32722bcc5d10485252043e82332cef0946"} Mar 08 03:25:40.780815 master-0 kubenswrapper[13046]: I0308 03:25:40.780815 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" event={"ID":"268919d0-afa6-48ed-a6cb-3f558fc78b5d","Type":"ContainerStarted","Data":"c31c682b2989006ee6064b1c6c83c7e4b3199ed38a9d439b99d1825475b7f4de"} Mar 08 03:25:40.781890 master-0 kubenswrapper[13046]: I0308 03:25:40.780829 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" event={"ID":"268919d0-afa6-48ed-a6cb-3f558fc78b5d","Type":"ContainerStarted","Data":"838771668fa6a6886510c4d57741c27fb3b01879c764995f0bb071b28966178c"} Mar 08 03:25:40.783961 master-0 kubenswrapper[13046]: I0308 03:25:40.783872 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" event={"ID":"2ea520cd-5fd4-4354-8cbb-38539cbef506","Type":"ContainerStarted","Data":"2a2187d651c190c9a63e35e8cb29d438f84ffa7bf06877cb1eea60cf3449fff5"} Mar 08 03:25:41.280504 master-0 kubenswrapper[13046]: I0308 03:25:41.278090 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:25:41.435425 master-0 kubenswrapper[13046]: I0308 03:25:41.435335 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-77897b758b-4ff46"] Mar 08 03:25:41.437220 master-0 kubenswrapper[13046]: I0308 03:25:41.437145 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.439616 master-0 kubenswrapper[13046]: I0308 03:25:41.439391 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 08 03:25:41.444094 master-0 kubenswrapper[13046]: I0308 03:25:41.442552 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 08 03:25:41.444094 master-0 kubenswrapper[13046]: I0308 03:25:41.442825 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 08 03:25:41.444094 master-0 kubenswrapper[13046]: I0308 03:25:41.443101 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 08 03:25:41.491739 master-0 kubenswrapper[13046]: I0308 03:25:41.444624 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 08 03:25:41.491739 master-0 kubenswrapper[13046]: I0308 03:25:41.449625 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 08 03:25:41.492973 master-0 kubenswrapper[13046]: I0308 03:25:41.492939 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-7ndsq" Mar 08 03:25:41.512279 master-0 kubenswrapper[13046]: I0308 03:25:41.510552 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-77897b758b-4ff46"] Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598679 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-serving-certs-ca-bundle\") pod 
\"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598726 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-secret-telemeter-client\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598755 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598773 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598789 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-telemeter-client-tls\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " 
pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598810 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-metrics-client-ca\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598831 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctfqh\" (UniqueName: \"kubernetes.io/projected/2f3aa698-2f96-4668-94ff-f287305790c7-kube-api-access-ctfqh\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.600980 master-0 kubenswrapper[13046]: I0308 03:25:41.598869 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-federate-client-tls\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.703653 master-0 kubenswrapper[13046]: I0308 03:25:41.703609 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-federate-client-tls\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.704030 master-0 kubenswrapper[13046]: I0308 03:25:41.704012 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-serving-certs-ca-bundle\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.704133 master-0 kubenswrapper[13046]: I0308 03:25:41.704121 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-secret-telemeter-client\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.714505 master-0 kubenswrapper[13046]: I0308 03:25:41.704214 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.714908 master-0 kubenswrapper[13046]: I0308 03:25:41.714879 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.714997 master-0 kubenswrapper[13046]: I0308 03:25:41.714983 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-telemeter-client-tls\") pod \"telemeter-client-77897b758b-4ff46\" (UID: 
\"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.715103 master-0 kubenswrapper[13046]: I0308 03:25:41.715087 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-metrics-client-ca\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.715204 master-0 kubenswrapper[13046]: I0308 03:25:41.715191 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctfqh\" (UniqueName: \"kubernetes.io/projected/2f3aa698-2f96-4668-94ff-f287305790c7-kube-api-access-ctfqh\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.715560 master-0 kubenswrapper[13046]: I0308 03:25:41.705911 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f"] Mar 08 03:25:41.716453 master-0 kubenswrapper[13046]: I0308 03:25:41.716435 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f"] Mar 08 03:25:41.716633 master-0 kubenswrapper[13046]: I0308 03:25:41.716619 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" Mar 08 03:25:41.720513 master-0 kubenswrapper[13046]: I0308 03:25:41.708570 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-federate-client-tls\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.720744 master-0 kubenswrapper[13046]: I0308 03:25:41.718763 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-telemeter-client-tls\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.720975 master-0 kubenswrapper[13046]: I0308 03:25:41.708108 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.721054 master-0 kubenswrapper[13046]: I0308 03:25:41.718927 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.721133 master-0 kubenswrapper[13046]: I0308 03:25:41.719627 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-metrics-client-ca\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.721269 master-0 kubenswrapper[13046]: I0308 03:25:41.719637 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f3aa698-2f96-4668-94ff-f287305790c7-serving-certs-ca-bundle\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.721476 master-0 kubenswrapper[13046]: I0308 03:25:41.720640 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-hlcg2" Mar 08 03:25:41.721570 master-0 kubenswrapper[13046]: I0308 03:25:41.720674 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 08 03:25:41.737623 master-0 kubenswrapper[13046]: I0308 03:25:41.733058 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2f3aa698-2f96-4668-94ff-f287305790c7-secret-telemeter-client\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.747323 master-0 kubenswrapper[13046]: I0308 03:25:41.747295 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctfqh\" (UniqueName: \"kubernetes.io/projected/2f3aa698-2f96-4668-94ff-f287305790c7-kube-api-access-ctfqh\") pod \"telemeter-client-77897b758b-4ff46\" (UID: \"2f3aa698-2f96-4668-94ff-f287305790c7\") " pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.795963 master-0 
kubenswrapper[13046]: I0308 03:25:41.795697 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerStarted","Data":"0cd6f3481c6d84187e23be93062fbf80fc34bdc67d26a48fc364b934826d50e4"} Mar 08 03:25:41.817749 master-0 kubenswrapper[13046]: I0308 03:25:41.816426 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/00a324ea-209d-4b0c-86af-3058436a291a-monitoring-plugin-cert\") pod \"monitoring-plugin-d9c677fbc-vwt7f\" (UID: \"00a324ea-209d-4b0c-86af-3058436a291a\") " pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" Mar 08 03:25:41.817749 master-0 kubenswrapper[13046]: I0308 03:25:41.816690 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" Mar 08 03:25:41.918117 master-0 kubenswrapper[13046]: I0308 03:25:41.917996 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/00a324ea-209d-4b0c-86af-3058436a291a-monitoring-plugin-cert\") pod \"monitoring-plugin-d9c677fbc-vwt7f\" (UID: \"00a324ea-209d-4b0c-86af-3058436a291a\") " pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" Mar 08 03:25:41.922212 master-0 kubenswrapper[13046]: I0308 03:25:41.922173 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/00a324ea-209d-4b0c-86af-3058436a291a-monitoring-plugin-cert\") pod \"monitoring-plugin-d9c677fbc-vwt7f\" (UID: \"00a324ea-209d-4b0c-86af-3058436a291a\") " pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" Mar 08 03:25:42.953294 master-0 kubenswrapper[13046]: I0308 03:25:42.953249 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" Mar 08 03:25:43.113000 master-0 kubenswrapper[13046]: I0308 03:25:43.110040 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-668dfc897d-db2r2"] Mar 08 03:25:43.113000 master-0 kubenswrapper[13046]: I0308 03:25:43.111545 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.121068 master-0 kubenswrapper[13046]: I0308 03:25:43.121025 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 03:25:43.121256 master-0 kubenswrapper[13046]: I0308 03:25:43.121217 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-766946c477-ghtbv"] Mar 08 03:25:43.128882 master-0 kubenswrapper[13046]: I0308 03:25:43.128832 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.130313 master-0 kubenswrapper[13046]: I0308 03:25:43.130279 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 08 03:25:43.131359 master-0 kubenswrapper[13046]: I0308 03:25:43.131334 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 08 03:25:43.133189 master-0 kubenswrapper[13046]: I0308 03:25:43.133165 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 08 03:25:43.133336 master-0 kubenswrapper[13046]: I0308 03:25:43.133310 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 08 03:25:43.133462 master-0 kubenswrapper[13046]: I0308 03:25:43.133427 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"metrics-server-aj3ujg6d4dtk9" Mar 08 03:25:43.135002 master-0 kubenswrapper[13046]: I0308 03:25:43.134974 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-668dfc897d-db2r2"] Mar 08 03:25:43.135698 master-0 kubenswrapper[13046]: I0308 03:25:43.135676 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-qr9xw" Mar 08 03:25:43.140048 master-0 kubenswrapper[13046]: I0308 03:25:43.139981 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-766946c477-ghtbv"] Mar 08 03:25:43.203519 master-0 kubenswrapper[13046]: I0308 03:25:43.203432 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-688d9d49f9-nnqwz"] Mar 08 03:25:43.207852 master-0 kubenswrapper[13046]: I0308 03:25:43.207807 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.211885 master-0 kubenswrapper[13046]: I0308 03:25:43.211832 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 08 03:25:43.212070 master-0 kubenswrapper[13046]: I0308 03:25:43.212046 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-69f8r" Mar 08 03:25:43.212344 master-0 kubenswrapper[13046]: I0308 03:25:43.212320 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 08 03:25:43.212583 master-0 kubenswrapper[13046]: I0308 03:25:43.212547 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 08 03:25:43.212619 master-0 kubenswrapper[13046]: I0308 03:25:43.212605 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 08 03:25:43.212735 master-0 kubenswrapper[13046]: I0308 03:25:43.212713 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 08 03:25:43.212783 master-0 kubenswrapper[13046]: I0308 03:25:43.212767 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-atdhv16h3mv90" Mar 08 03:25:43.237554 master-0 kubenswrapper[13046]: I0308 03:25:43.236171 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-688d9d49f9-nnqwz"] Mar 08 03:25:43.241141 master-0 kubenswrapper[13046]: I0308 03:25:43.240315 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rw6b\" (UniqueName: \"kubernetes.io/projected/1893fc9c-7c29-4674-8011-f046dd63a08b-kube-api-access-9rw6b\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.241141 master-0 kubenswrapper[13046]: I0308 03:25:43.240379 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1893fc9c-7c29-4674-8011-f046dd63a08b-metrics-server-audit-profiles\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.241141 master-0 kubenswrapper[13046]: I0308 03:25:43.240418 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-secret-metrics-server-tls\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " 
pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.241141 master-0 kubenswrapper[13046]: I0308 03:25:43.240444 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-oauth-serving-cert\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.241141 master-0 kubenswrapper[13046]: I0308 03:25:43.240463 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-secret-metrics-client-certs\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.244617 master-0 kubenswrapper[13046]: I0308 03:25:43.244595 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-client-ca-bundle\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.244718 master-0 kubenswrapper[13046]: I0308 03:25:43.244643 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-config\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.244718 master-0 kubenswrapper[13046]: I0308 03:25:43.244709 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-trusted-ca-bundle\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.244781 master-0 kubenswrapper[13046]: I0308 03:25:43.244728 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98wdn\" (UniqueName: \"kubernetes.io/projected/344be988-4e0e-46c9-9ba9-a84e76abe7bc-kube-api-access-98wdn\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.244781 master-0 kubenswrapper[13046]: I0308 03:25:43.244762 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-oauth-config\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.244873 master-0 kubenswrapper[13046]: I0308 03:25:43.244781 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1893fc9c-7c29-4674-8011-f046dd63a08b-audit-log\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.244916 master-0 kubenswrapper[13046]: I0308 03:25:43.244865 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-serving-cert\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 
03:25:43.244947 master-0 kubenswrapper[13046]: I0308 03:25:43.244933 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1893fc9c-7c29-4674-8011-f046dd63a08b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.244979 master-0 kubenswrapper[13046]: I0308 03:25:43.244966 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-service-ca\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346081 master-0 kubenswrapper[13046]: I0308 03:25:43.346018 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-client-ca-bundle\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.346081 master-0 kubenswrapper[13046]: I0308 03:25:43.346065 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-config\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346081 master-0 kubenswrapper[13046]: I0308 03:25:43.346089 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-trusted-ca-bundle\") pod 
\"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346081 master-0 kubenswrapper[13046]: I0308 03:25:43.346106 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98wdn\" (UniqueName: \"kubernetes.io/projected/344be988-4e0e-46c9-9ba9-a84e76abe7bc-kube-api-access-98wdn\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346081 master-0 kubenswrapper[13046]: I0308 03:25:43.346128 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-oauth-config\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346151 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-grpc-tls\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346180 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1893fc9c-7c29-4674-8011-f046dd63a08b-audit-log\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346198 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3311dbb-cd30-4cd9-9f18-d360521bec39-metrics-client-ca\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346218 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-serving-cert\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346234 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-tls\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346255 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346287 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1893fc9c-7c29-4674-8011-f046dd63a08b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-766946c477-ghtbv\" (UID: 
\"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346305 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346325 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346341 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-service-ca\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346372 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346423 
master-0 kubenswrapper[13046]: I0308 03:25:43.346391 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rw6b\" (UniqueName: \"kubernetes.io/projected/1893fc9c-7c29-4674-8011-f046dd63a08b-kube-api-access-9rw6b\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346415 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1893fc9c-7c29-4674-8011-f046dd63a08b-metrics-server-audit-profiles\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.346423 master-0 kubenswrapper[13046]: I0308 03:25:43.346434 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfczz\" (UniqueName: \"kubernetes.io/projected/b3311dbb-cd30-4cd9-9f18-d360521bec39-kube-api-access-tfczz\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.346832 master-0 kubenswrapper[13046]: I0308 03:25:43.346460 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-secret-metrics-server-tls\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.346832 master-0 kubenswrapper[13046]: I0308 03:25:43.346499 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-oauth-serving-cert\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.346832 master-0 kubenswrapper[13046]: I0308 03:25:43.346516 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-secret-metrics-client-certs\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.349346 master-0 kubenswrapper[13046]: I0308 03:25:43.349284 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-secret-metrics-client-certs\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.352300 master-0 kubenswrapper[13046]: I0308 03:25:43.351530 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-client-ca-bundle\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.352300 master-0 kubenswrapper[13046]: I0308 03:25:43.352206 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-config\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.353058 master-0 kubenswrapper[13046]: I0308 03:25:43.353027 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-trusted-ca-bundle\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.358084 master-0 kubenswrapper[13046]: I0308 03:25:43.358052 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-oauth-config\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.358551 master-0 kubenswrapper[13046]: I0308 03:25:43.358524 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1893fc9c-7c29-4674-8011-f046dd63a08b-audit-log\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.360903 master-0 kubenswrapper[13046]: I0308 03:25:43.360872 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-serving-cert\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.361576 master-0 kubenswrapper[13046]: I0308 03:25:43.361536 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1893fc9c-7c29-4674-8011-f046dd63a08b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 
03:25:43.362756 master-0 kubenswrapper[13046]: I0308 03:25:43.362253 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-service-ca\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.366040 master-0 kubenswrapper[13046]: I0308 03:25:43.366016 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1893fc9c-7c29-4674-8011-f046dd63a08b-metrics-server-audit-profiles\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.368390 master-0 kubenswrapper[13046]: I0308 03:25:43.368368 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1893fc9c-7c29-4674-8011-f046dd63a08b-secret-metrics-server-tls\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.369460 master-0 kubenswrapper[13046]: I0308 03:25:43.369431 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-oauth-serving-cert\") pod \"console-668dfc897d-db2r2\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.371460 master-0 kubenswrapper[13046]: I0308 03:25:43.371430 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98wdn\" (UniqueName: \"kubernetes.io/projected/344be988-4e0e-46c9-9ba9-a84e76abe7bc-kube-api-access-98wdn\") pod \"console-668dfc897d-db2r2\" (UID: 
\"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") " pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.375421 master-0 kubenswrapper[13046]: I0308 03:25:43.375401 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rw6b\" (UniqueName: \"kubernetes.io/projected/1893fc9c-7c29-4674-8011-f046dd63a08b-kube-api-access-9rw6b\") pod \"metrics-server-766946c477-ghtbv\" (UID: \"1893fc9c-7c29-4674-8011-f046dd63a08b\") " pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.447680 master-0 kubenswrapper[13046]: I0308 03:25:43.447641 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.448248 master-0 kubenswrapper[13046]: I0308 03:25:43.448226 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.448308 master-0 kubenswrapper[13046]: I0308 03:25:43.448272 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfczz\" (UniqueName: \"kubernetes.io/projected/b3311dbb-cd30-4cd9-9f18-d360521bec39-kube-api-access-tfczz\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.448349 master-0 kubenswrapper[13046]: I0308 03:25:43.448331 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-grpc-tls\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.448452 master-0 kubenswrapper[13046]: I0308 03:25:43.448350 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3311dbb-cd30-4cd9-9f18-d360521bec39-metrics-client-ca\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.448452 master-0 kubenswrapper[13046]: I0308 03:25:43.448370 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-tls\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.448452 master-0 kubenswrapper[13046]: I0308 03:25:43.448389 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.448452 master-0 kubenswrapper[13046]: I0308 03:25:43.448407 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.450674 master-0 kubenswrapper[13046]: I0308 03:25:43.450642 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3311dbb-cd30-4cd9-9f18-d360521bec39-metrics-client-ca\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.453120 master-0 kubenswrapper[13046]: I0308 03:25:43.452932 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.453397 master-0 kubenswrapper[13046]: I0308 03:25:43.453359 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-grpc-tls\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.453889 master-0 kubenswrapper[13046]: I0308 03:25:43.453833 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.460125 master-0 kubenswrapper[13046]: I0308 03:25:43.460093 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-tls\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.460195 master-0 kubenswrapper[13046]: I0308 03:25:43.460123 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.460389 master-0 kubenswrapper[13046]: I0308 03:25:43.460352 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b3311dbb-cd30-4cd9-9f18-d360521bec39-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.470945 master-0 kubenswrapper[13046]: I0308 03:25:43.470915 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfczz\" (UniqueName: \"kubernetes.io/projected/b3311dbb-cd30-4cd9-9f18-d360521bec39-kube-api-access-tfczz\") pod \"thanos-querier-688d9d49f9-nnqwz\" (UID: \"b3311dbb-cd30-4cd9-9f18-d360521bec39\") " pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.512737 master-0 kubenswrapper[13046]: I0308 03:25:43.512663 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-668dfc897d-db2r2" Mar 08 03:25:43.558616 master-0 kubenswrapper[13046]: I0308 03:25:43.557988 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:25:43.595376 master-0 kubenswrapper[13046]: I0308 03:25:43.594981 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" Mar 08 03:25:43.598361 master-0 kubenswrapper[13046]: I0308 03:25:43.598174 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f"] Mar 08 03:25:43.751040 master-0 kubenswrapper[13046]: I0308 03:25:43.751010 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-77897b758b-4ff46"] Mar 08 03:25:43.811960 master-0 kubenswrapper[13046]: I0308 03:25:43.811908 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rtdkr" event={"ID":"23651d40-89da-46c4-a6cb-b4c031e826cb","Type":"ContainerStarted","Data":"836cfa558f4a1972c42feab8f1b4c7b1c8096a5f0e646af641d4a1fcbcb53e64"} Mar 08 03:25:44.049660 master-0 kubenswrapper[13046]: W0308 03:25:44.049407 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a324ea_209d_4b0c_86af_3058436a291a.slice/crio-c5b642ba903bd66a20c1f170e466e7a75add6f5d59fea7f8a7a172b7ff0805a5 WatchSource:0}: Error finding container c5b642ba903bd66a20c1f170e466e7a75add6f5d59fea7f8a7a172b7ff0805a5: Status 404 returned error can't find the container with id c5b642ba903bd66a20c1f170e466e7a75add6f5d59fea7f8a7a172b7ff0805a5 Mar 08 03:25:44.819181 master-0 kubenswrapper[13046]: I0308 03:25:44.819069 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" 
event={"ID":"2f3aa698-2f96-4668-94ff-f287305790c7","Type":"ContainerStarted","Data":"2657da9d5537cf92c05ecbd6e6dae0334584f91f3a8688b50b3c5158a057dc8e"} Mar 08 03:25:44.820280 master-0 kubenswrapper[13046]: I0308 03:25:44.820245 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" event={"ID":"00a324ea-209d-4b0c-86af-3058436a291a","Type":"ContainerStarted","Data":"c5b642ba903bd66a20c1f170e466e7a75add6f5d59fea7f8a7a172b7ff0805a5"} Mar 08 03:25:44.821864 master-0 kubenswrapper[13046]: I0308 03:25:44.821839 13046 generic.go:334] "Generic (PLEG): container finished" podID="23651d40-89da-46c4-a6cb-b4c031e826cb" containerID="836cfa558f4a1972c42feab8f1b4c7b1c8096a5f0e646af641d4a1fcbcb53e64" exitCode=0 Mar 08 03:25:44.821940 master-0 kubenswrapper[13046]: I0308 03:25:44.821874 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rtdkr" event={"ID":"23651d40-89da-46c4-a6cb-b4c031e826cb","Type":"ContainerDied","Data":"836cfa558f4a1972c42feab8f1b4c7b1c8096a5f0e646af641d4a1fcbcb53e64"} Mar 08 03:25:46.200114 master-0 kubenswrapper[13046]: I0308 03:25:46.199874 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 03:25:46.205667 master-0 kubenswrapper[13046]: I0308 03:25:46.205625 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.209674 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.209877 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-xfc6f" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.210425 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-1allgkpdij0ou" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.210699 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.210851 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.210984 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.211641 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.211813 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.211910 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 03:25:46.213502 master-0 kubenswrapper[13046]: I0308 03:25:46.212109 13046 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 08 03:25:46.217524 master-0 kubenswrapper[13046]: I0308 03:25:46.214691 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 08 03:25:46.217524 master-0 kubenswrapper[13046]: I0308 03:25:46.217049 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 08 03:25:46.221527 master-0 kubenswrapper[13046]: I0308 03:25:46.218265 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 08 03:25:46.241301 master-0 kubenswrapper[13046]: I0308 03:25:46.239237 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270254 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270308 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270337 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270362 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270392 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270455 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270474 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270508 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270528 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270555 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270593 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxxj4\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-kube-api-access-wxxj4\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270645 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 
03:25:46.270666 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270683 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270717 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270735 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270751 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-grpc-tls\") pod \"prometheus-k8s-0\" 
(UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.271629 master-0 kubenswrapper[13046]: I0308 03:25:46.270770 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.309189 master-0 kubenswrapper[13046]: I0308 03:25:46.307289 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-668dfc897d-db2r2"] Mar 08 03:25:46.372500 master-0 kubenswrapper[13046]: I0308 03:25:46.372419 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.372687 master-0 kubenswrapper[13046]: I0308 03:25:46.372574 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.372810 master-0 kubenswrapper[13046]: I0308 03:25:46.372762 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 
03:25:46.373395 master-0 kubenswrapper[13046]: I0308 03:25:46.373357 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373574 master-0 kubenswrapper[13046]: I0308 03:25:46.373528 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373624 master-0 kubenswrapper[13046]: I0308 03:25:46.373602 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373670 master-0 kubenswrapper[13046]: I0308 03:25:46.373634 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373670 master-0 kubenswrapper[13046]: I0308 03:25:46.373663 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373747 master-0 kubenswrapper[13046]: I0308 03:25:46.373739 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373791 master-0 kubenswrapper[13046]: I0308 03:25:46.373771 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373835 master-0 kubenswrapper[13046]: I0308 03:25:46.373804 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373835 master-0 kubenswrapper[13046]: I0308 03:25:46.373833 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373918 master-0 kubenswrapper[13046]: I0308 03:25:46.373896 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-kube-rbac-proxy\") 
pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373955 master-0 kubenswrapper[13046]: I0308 03:25:46.373917 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.373955 master-0 kubenswrapper[13046]: I0308 03:25:46.373941 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.374034 master-0 kubenswrapper[13046]: I0308 03:25:46.373963 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.374034 master-0 kubenswrapper[13046]: I0308 03:25:46.373997 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.374115 master-0 kubenswrapper[13046]: I0308 03:25:46.374040 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.374115 master-0 kubenswrapper[13046]: I0308 03:25:46.374085 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxxj4\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-kube-api-access-wxxj4\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.374590 master-0 kubenswrapper[13046]: I0308 03:25:46.374541 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.375236 master-0 kubenswrapper[13046]: I0308 03:25:46.375196 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.376448 master-0 kubenswrapper[13046]: I0308 03:25:46.376420 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.376749 master-0 kubenswrapper[13046]: I0308 03:25:46.376724 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.378679 master-0 kubenswrapper[13046]: I0308 03:25:46.377741 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.378873 master-0 kubenswrapper[13046]: I0308 03:25:46.378760 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.379941 master-0 kubenswrapper[13046]: I0308 03:25:46.379913 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.381254 master-0 kubenswrapper[13046]: I0308 03:25:46.381217 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.381312 master-0 kubenswrapper[13046]: I0308 03:25:46.381249 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.381825 master-0 kubenswrapper[13046]: I0308 03:25:46.381792 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.382041 master-0 kubenswrapper[13046]: I0308 03:25:46.382006 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-web-config\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.382301 master-0 kubenswrapper[13046]: I0308 03:25:46.382237 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config-out\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.382671 master-0 kubenswrapper[13046]: I0308 03:25:46.382609 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.395006 master-0 kubenswrapper[13046]: I0308 03:25:46.394971 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.395328 master-0 kubenswrapper[13046]: I0308 03:25:46.395311 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.397191 master-0 kubenswrapper[13046]: I0308 03:25:46.397164 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.397537 master-0 kubenswrapper[13046]: I0308 03:25:46.397517 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxxj4\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-kube-api-access-wxxj4\") pod \"prometheus-k8s-0\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:46.609866 master-0 kubenswrapper[13046]: I0308 03:25:46.609289 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:25:47.606283 master-0 kubenswrapper[13046]: I0308 03:25:47.606228 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:25:47.607671 master-0 kubenswrapper[13046]: I0308 03:25:47.607647 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.610040 master-0 kubenswrapper[13046]: I0308 03:25:47.609978 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 03:25:47.611523 master-0 kubenswrapper[13046]: I0308 03:25:47.610604 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-gcdsk" Mar 08 03:25:47.618841 master-0 kubenswrapper[13046]: I0308 03:25:47.618033 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:25:47.708182 master-0 kubenswrapper[13046]: I0308 03:25:47.708133 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.708383 master-0 kubenswrapper[13046]: I0308 03:25:47.708203 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-var-lock\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.708383 master-0 kubenswrapper[13046]: I0308 03:25:47.708238 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.810067 master-0 kubenswrapper[13046]: I0308 03:25:47.810018 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.810272 master-0 kubenswrapper[13046]: I0308 03:25:47.810121 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-var-lock\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.810272 master-0 kubenswrapper[13046]: I0308 03:25:47.810163 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.810382 master-0 kubenswrapper[13046]: I0308 03:25:47.810317 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-var-lock\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.810443 master-0 kubenswrapper[13046]: I0308 03:25:47.810401 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.841260 master-0 kubenswrapper[13046]: I0308 03:25:47.841018 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.927903 master-0 kubenswrapper[13046]: I0308 03:25:47.927794 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:25:47.976326 master-0 kubenswrapper[13046]: W0308 03:25:47.976264 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod344be988_4e0e_46c9_9ba9_a84e76abe7bc.slice/crio-245ea35bfc2f9c3196ff568e863b40afee8588a6a70a8ce6270746ae95ff79e0 WatchSource:0}: Error finding container 245ea35bfc2f9c3196ff568e863b40afee8588a6a70a8ce6270746ae95ff79e0: Status 404 returned error can't find the container with id 245ea35bfc2f9c3196ff568e863b40afee8588a6a70a8ce6270746ae95ff79e0 Mar 08 03:25:48.401958 master-0 kubenswrapper[13046]: I0308 03:25:48.401908 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-688d9d49f9-nnqwz"] Mar 08 03:25:48.869502 master-0 kubenswrapper[13046]: I0308 03:25:48.869414 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" event={"ID":"b3311dbb-cd30-4cd9-9f18-d360521bec39","Type":"ContainerStarted","Data":"03c6dfe28f8921b60d4115d7a6a4bb982d6b0f53f0aa5ec20e7c01def592c517"} Mar 08 03:25:48.871604 master-0 kubenswrapper[13046]: I0308 03:25:48.871517 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668dfc897d-db2r2" event={"ID":"344be988-4e0e-46c9-9ba9-a84e76abe7bc","Type":"ContainerStarted","Data":"245ea35bfc2f9c3196ff568e863b40afee8588a6a70a8ce6270746ae95ff79e0"} Mar 08 03:25:48.914041 master-0 kubenswrapper[13046]: I0308 03:25:48.913966 13046 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-766946c477-ghtbv"] Mar 08 03:25:50.145812 master-0 kubenswrapper[13046]: I0308 03:25:50.145764 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-772zs_23b66415-df37-4015-9a0c-69115b3a0739/multus-admission-controller/0.log" Mar 08 03:25:50.146644 master-0 kubenswrapper[13046]: I0308 03:25:50.146580 13046 generic.go:334] "Generic (PLEG): container finished" podID="23b66415-df37-4015-9a0c-69115b3a0739" containerID="bf7afb690bf11b8a7c9ce9f568adbdaaa57866a3aff5ced1711ca0a11620089f" exitCode=137 Mar 08 03:25:50.147084 master-0 kubenswrapper[13046]: I0308 03:25:50.146677 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" event={"ID":"23b66415-df37-4015-9a0c-69115b3a0739","Type":"ContainerDied","Data":"bf7afb690bf11b8a7c9ce9f568adbdaaa57866a3aff5ced1711ca0a11620089f"} Mar 08 03:25:50.149993 master-0 kubenswrapper[13046]: I0308 03:25:50.149960 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-766946c477-ghtbv" event={"ID":"1893fc9c-7c29-4674-8011-f046dd63a08b","Type":"ContainerStarted","Data":"34d6004e0d3f45e1c31fa733bae6b77198e1c20d3c6d6acfff58a0b3678c9f43"} Mar 08 03:25:50.545666 master-0 kubenswrapper[13046]: I0308 03:25:50.545638 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-772zs_23b66415-df37-4015-9a0c-69115b3a0739/multus-admission-controller/0.log" Mar 08 03:25:50.545743 master-0 kubenswrapper[13046]: I0308 03:25:50.545705 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" Mar 08 03:25:50.620426 master-0 kubenswrapper[13046]: I0308 03:25:50.618364 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:25:50.642167 master-0 kubenswrapper[13046]: I0308 03:25:50.641801 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 03:25:50.642289 master-0 kubenswrapper[13046]: I0308 03:25:50.642246 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") pod \"23b66415-df37-4015-9a0c-69115b3a0739\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " Mar 08 03:25:50.642533 master-0 kubenswrapper[13046]: I0308 03:25:50.642516 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") pod \"23b66415-df37-4015-9a0c-69115b3a0739\" (UID: \"23b66415-df37-4015-9a0c-69115b3a0739\") " Mar 08 03:25:50.645701 master-0 kubenswrapper[13046]: I0308 03:25:50.645650 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc" (OuterVolumeSpecName: "kube-api-access-4nrpc") pod "23b66415-df37-4015-9a0c-69115b3a0739" (UID: "23b66415-df37-4015-9a0c-69115b3a0739"). InnerVolumeSpecName "kube-api-access-4nrpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:25:50.645770 master-0 kubenswrapper[13046]: I0308 03:25:50.645689 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "23b66415-df37-4015-9a0c-69115b3a0739" (UID: "23b66415-df37-4015-9a0c-69115b3a0739"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:25:50.680909 master-0 kubenswrapper[13046]: W0308 03:25:50.680871 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb36ba129_ccfe_4dfc_889a_aa98d4dece4a.slice/crio-b972771ed331a46f9d80ea803f2110481c6c95662feba692274813259f88d537 WatchSource:0}: Error finding container b972771ed331a46f9d80ea803f2110481c6c95662feba692274813259f88d537: Status 404 returned error can't find the container with id b972771ed331a46f9d80ea803f2110481c6c95662feba692274813259f88d537 Mar 08 03:25:50.744412 master-0 kubenswrapper[13046]: I0308 03:25:50.744376 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nrpc\" (UniqueName: \"kubernetes.io/projected/23b66415-df37-4015-9a0c-69115b3a0739-kube-api-access-4nrpc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:50.744412 master-0 kubenswrapper[13046]: I0308 03:25:50.744405 13046 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23b66415-df37-4015-9a0c-69115b3a0739-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:51.173242 master-0 kubenswrapper[13046]: I0308 03:25:51.173209 13046 generic.go:334] "Generic (PLEG): container finished" podID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerID="7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c" exitCode=0 Mar 08 03:25:51.182807 master-0 kubenswrapper[13046]: I0308 03:25:51.173256 13046 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c"}
Mar 08 03:25:51.186347 master-0 kubenswrapper[13046]: I0308 03:25:51.186264 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" event={"ID":"268919d0-afa6-48ed-a6cb-3f558fc78b5d","Type":"ContainerStarted","Data":"a863d949801318fc3f76587a73b25958e77984b88d88e1ce62ab2a4f6f4a8748"}
Mar 08 03:25:51.197226 master-0 kubenswrapper[13046]: I0308 03:25:51.197189 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerStarted","Data":"b972771ed331a46f9d80ea803f2110481c6c95662feba692274813259f88d537"}
Mar 08 03:25:51.202355 master-0 kubenswrapper[13046]: I0308 03:25:51.202323 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" event={"ID":"2ea520cd-5fd4-4354-8cbb-38539cbef506","Type":"ContainerStarted","Data":"48854612cff0cfef3b7d93b56078818dca3b0d9652c4ca3ecc4bec5cebab7662"}
Mar 08 03:25:51.202440 master-0 kubenswrapper[13046]: I0308 03:25:51.202360 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" event={"ID":"2ea520cd-5fd4-4354-8cbb-38539cbef506","Type":"ContainerStarted","Data":"70bdd5de4102ea7581ed5da4ea627caf24ebd5525577df6efbb18b9d60ddc64b"}
Mar 08 03:25:51.211028 master-0 kubenswrapper[13046]: I0308 03:25:51.210988 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" event={"ID":"00a324ea-209d-4b0c-86af-3058436a291a","Type":"ContainerStarted","Data":"f918da520a93d89b5da6762ed089950100725bfae4e7f1be69a5d1b3b4a0e846"}
Mar 08 03:25:51.211294 master-0 kubenswrapper[13046]: I0308 03:25:51.211265 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f"
Mar 08 03:25:51.213629 master-0 kubenswrapper[13046]: I0308 03:25:51.213501 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b","Type":"ContainerStarted","Data":"02f2d6e6b199f8378e5bd76b83e4837ea3acf83f6c358c0f384995a8a9484b9c"}
Mar 08 03:25:51.215699 master-0 kubenswrapper[13046]: I0308 03:25:51.215660 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc7d49677-gsx8f" event={"ID":"158946fc-eae1-4823-a93c-398d4aede495","Type":"ContainerStarted","Data":"b4d6c4c4750abb273febb1402324cd536599af35d258e2f54378387537d6e101"}
Mar 08 03:25:51.219074 master-0 kubenswrapper[13046]: I0308 03:25:51.218991 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-772zs_23b66415-df37-4015-9a0c-69115b3a0739/multus-admission-controller/0.log"
Mar 08 03:25:51.219127 master-0 kubenswrapper[13046]: I0308 03:25:51.219093 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-772zs"
Mar 08 03:25:51.219235 master-0 kubenswrapper[13046]: I0308 03:25:51.219211 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-772zs" event={"ID":"23b66415-df37-4015-9a0c-69115b3a0739","Type":"ContainerDied","Data":"04168952ada741f79304ee9b25e1212567fc1ce3d719a0050a26b711accbbea4"}
Mar 08 03:25:51.219288 master-0 kubenswrapper[13046]: I0308 03:25:51.219262 13046 scope.go:117] "RemoveContainer" containerID="29c6ed5b13bfb915384e6141f8cbf16cba543eb6524f87e0bd97e324ceae1c63"
Mar 08 03:25:51.219883 master-0 kubenswrapper[13046]: I0308 03:25:51.219864 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f"
Mar 08 03:25:51.222068 master-0 kubenswrapper[13046]: I0308 03:25:51.221746 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" event={"ID":"2f3aa698-2f96-4668-94ff-f287305790c7","Type":"ContainerStarted","Data":"e7b41487f737e711237c6e65089939e5d79d759f3975f165f3720fc4ef4e0896"}
Mar 08 03:25:51.222120 master-0 kubenswrapper[13046]: I0308 03:25:51.222078 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" event={"ID":"2f3aa698-2f96-4668-94ff-f287305790c7","Type":"ContainerStarted","Data":"4232db79bdef6dcb4539ec28a1e261409f2fd4d83d68a4627c7bd439499c5376"}
Mar 08 03:25:51.227580 master-0 kubenswrapper[13046]: I0308 03:25:51.226332 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rtdkr" event={"ID":"23651d40-89da-46c4-a6cb-b4c031e826cb","Type":"ContainerStarted","Data":"7cc71fdfba42b774b476a8a68a6bd355f6f0e08c02ca3799776c8524907828ba"}
Mar 08 03:25:51.227580 master-0 kubenswrapper[13046]: I0308 03:25:51.226396 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rtdkr" event={"ID":"23651d40-89da-46c4-a6cb-b4c031e826cb","Type":"ContainerStarted","Data":"0b6128adf2d738ec9618869717d4ae7d2d640d7107f52e946e10ad5c9e8eaba4"}
Mar 08 03:25:51.228658 master-0 kubenswrapper[13046]: I0308 03:25:51.228607 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668dfc897d-db2r2" event={"ID":"344be988-4e0e-46c9-9ba9-a84e76abe7bc","Type":"ContainerStarted","Data":"b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725"}
Mar 08 03:25:51.265772 master-0 kubenswrapper[13046]: I0308 03:25:51.265671 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-mcvbt" podStartSLOduration=8.314300406 podStartE2EDuration="16.26565321s" podCreationTimestamp="2026-03-08 03:25:35 +0000 UTC" firstStartedPulling="2026-03-08 03:25:40.590426452 +0000 UTC m=+742.669193669" lastFinishedPulling="2026-03-08 03:25:48.541779256 +0000 UTC m=+750.620546473" observedRunningTime="2026-03-08 03:25:51.257548926 +0000 UTC m=+753.336316153" watchObservedRunningTime="2026-03-08 03:25:51.26565321 +0000 UTC m=+753.344420427"
Mar 08 03:25:51.316709 master-0 kubenswrapper[13046]: I0308 03:25:51.316676 13046 scope.go:117] "RemoveContainer" containerID="bf7afb690bf11b8a7c9ce9f568adbdaaa57866a3aff5ced1711ca0a11620089f"
Mar 08 03:25:51.324598 master-0 kubenswrapper[13046]: I0308 03:25:51.324529 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-d9c677fbc-vwt7f" podStartSLOduration=4.817164803 podStartE2EDuration="10.324509312s" podCreationTimestamp="2026-03-08 03:25:41 +0000 UTC" firstStartedPulling="2026-03-08 03:25:44.051945035 +0000 UTC m=+746.130712252" lastFinishedPulling="2026-03-08 03:25:49.559289544 +0000 UTC m=+751.638056761" observedRunningTime="2026-03-08 03:25:51.274806884 +0000 UTC m=+753.353574101" watchObservedRunningTime="2026-03-08 03:25:51.324509312 +0000 UTC m=+753.403276529"
Mar 08 03:25:51.325811 master-0 kubenswrapper[13046]: I0308 03:25:51.325775 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dc7d49677-gsx8f" podStartSLOduration=3.167849048 podStartE2EDuration="17.325766477s" podCreationTimestamp="2026-03-08 03:25:34 +0000 UTC" firstStartedPulling="2026-03-08 03:25:35.385982749 +0000 UTC m=+737.464749966" lastFinishedPulling="2026-03-08 03:25:49.543900188 +0000 UTC m=+751.622667395" observedRunningTime="2026-03-08 03:25:51.323290688 +0000 UTC m=+753.402057915" watchObservedRunningTime="2026-03-08 03:25:51.325766477 +0000 UTC m=+753.404533704"
Mar 08 03:25:51.375189 master-0 kubenswrapper[13046]: I0308 03:25:51.375084 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rtdkr" podStartSLOduration=10.928264762 podStartE2EDuration="16.375056603s" podCreationTimestamp="2026-03-08 03:25:35 +0000 UTC" firstStartedPulling="2026-03-08 03:25:37.663813107 +0000 UTC m=+739.742580324" lastFinishedPulling="2026-03-08 03:25:43.110604948 +0000 UTC m=+745.189372165" observedRunningTime="2026-03-08 03:25:51.353024062 +0000 UTC m=+753.431791279" watchObservedRunningTime="2026-03-08 03:25:51.375056603 +0000 UTC m=+753.453823820"
Mar 08 03:25:51.432365 master-0 kubenswrapper[13046]: I0308 03:25:51.432069 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-668dfc897d-db2r2" podStartSLOduration=6.317619545 podStartE2EDuration="8.432048093s" podCreationTimestamp="2026-03-08 03:25:43 +0000 UTC" firstStartedPulling="2026-03-08 03:25:47.999605646 +0000 UTC m=+750.078372863" lastFinishedPulling="2026-03-08 03:25:50.114034194 +0000 UTC m=+752.192801411" observedRunningTime="2026-03-08 03:25:51.392180588 +0000 UTC m=+753.470947795" watchObservedRunningTime="2026-03-08 03:25:51.432048093 +0000 UTC m=+753.510815310"
Mar 08 03:25:51.448181 master-0 kubenswrapper[13046]: I0308 03:25:51.448058 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-772zs"]
Mar 08 03:25:51.452596 master-0 kubenswrapper[13046]: I0308 03:25:51.452544 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-772zs"]
Mar 08 03:25:52.134373 master-0 kubenswrapper[13046]: I0308 03:25:52.134315 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b66415-df37-4015-9a0c-69115b3a0739" path="/var/lib/kubelet/pods/23b66415-df37-4015-9a0c-69115b3a0739/volumes"
Mar 08 03:25:52.235138 master-0 kubenswrapper[13046]: I0308 03:25:52.235088 13046 generic.go:334] "Generic (PLEG): container finished" podID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09" exitCode=0
Mar 08 03:25:52.235604 master-0 kubenswrapper[13046]: I0308 03:25:52.235151 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"}
Mar 08 03:25:52.254444 master-0 kubenswrapper[13046]: I0308 03:25:52.254386 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" event={"ID":"2ea520cd-5fd4-4354-8cbb-38539cbef506","Type":"ContainerStarted","Data":"89440949f3624a843119929b2b710c64d4437cdd61406f95e47fb68b604664cb"}
Mar 08 03:25:52.260307 master-0 kubenswrapper[13046]: I0308 03:25:52.260256 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" event={"ID":"2f3aa698-2f96-4668-94ff-f287305790c7","Type":"ContainerStarted","Data":"a6603c6d357ee31630d3ce5794653d6232ccf637cc56031a86fbddf72e0c5c7d"}
Mar 08 03:25:52.264430 master-0 kubenswrapper[13046]: I0308 03:25:52.264382 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b","Type":"ContainerStarted","Data":"61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24"}
Mar 08 03:25:52.313934 master-0 kubenswrapper[13046]: I0308 03:25:52.313855 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-t5lc5" podStartSLOduration=11.525705763 podStartE2EDuration="17.313836087s" podCreationTimestamp="2026-03-08 03:25:35 +0000 UTC" firstStartedPulling="2026-03-08 03:25:40.058745272 +0000 UTC m=+742.137512489" lastFinishedPulling="2026-03-08 03:25:45.846875596 +0000 UTC m=+747.925642813" observedRunningTime="2026-03-08 03:25:52.310580597 +0000 UTC m=+754.389347814" watchObservedRunningTime="2026-03-08 03:25:52.313836087 +0000 UTC m=+754.392603304"
Mar 08 03:25:52.353866 master-0 kubenswrapper[13046]: I0308 03:25:52.353775 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-77897b758b-4ff46" podStartSLOduration=5.807898418 podStartE2EDuration="11.353752574s" podCreationTimestamp="2026-03-08 03:25:41 +0000 UTC" firstStartedPulling="2026-03-08 03:25:44.045751954 +0000 UTC m=+746.124519171" lastFinishedPulling="2026-03-08 03:25:49.59160612 +0000 UTC m=+751.670373327" observedRunningTime="2026-03-08 03:25:52.353215629 +0000 UTC m=+754.431982846" watchObservedRunningTime="2026-03-08 03:25:52.353752574 +0000 UTC m=+754.432519801"
Mar 08 03:25:52.384982 master-0 kubenswrapper[13046]: I0308 03:25:52.384821 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=5.384795205 podStartE2EDuration="5.384795205s" podCreationTimestamp="2026-03-08 03:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:52.377533273 +0000 UTC m=+754.456300480" watchObservedRunningTime="2026-03-08 03:25:52.384795205 +0000 UTC m=+754.463562422"
Mar 08 03:25:52.613982 master-0 kubenswrapper[13046]: I0308 03:25:52.613743 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6494b94d74-kwkcq"]
Mar 08 03:25:52.614199 master-0 kubenswrapper[13046]: I0308 03:25:52.614029 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager" containerID="cri-o://b49e315423dd470e6b6da770027936f226df443789484303d1380c81a8337f72" gracePeriod=30
Mar 08 03:25:52.645592 master-0 kubenswrapper[13046]: I0308 03:25:52.644528 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"]
Mar 08 03:25:52.645592 master-0 kubenswrapper[13046]: I0308 03:25:52.644742 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" containerID="cri-o://b182ce771137b026d98ba2b5690c6edd71dd016caec91bd289049a210a43602b" gracePeriod=30
Mar 08 03:25:53.054275 master-0 kubenswrapper[13046]: I0308 03:25:53.054153 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dc7d49677-gsx8f"]
Mar 08 03:25:53.082121 master-0 kubenswrapper[13046]: I0308 03:25:53.082056 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7977cd7c97-8tssk"]
Mar 08 03:25:53.088906 master-0 kubenswrapper[13046]: E0308 03:25:53.082888 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="kube-rbac-proxy"
Mar 08 03:25:53.088906 master-0 kubenswrapper[13046]: I0308 03:25:53.082912 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="kube-rbac-proxy"
Mar 08 03:25:53.088906 master-0 kubenswrapper[13046]: E0308 03:25:53.082941 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="multus-admission-controller"
Mar 08 03:25:53.088906 master-0 kubenswrapper[13046]: I0308 03:25:53.082951 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="multus-admission-controller"
Mar 08 03:25:53.088906 master-0 kubenswrapper[13046]: I0308 03:25:53.083247 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="kube-rbac-proxy"
Mar 08 03:25:53.088906 master-0 kubenswrapper[13046]: I0308 03:25:53.083280 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b66415-df37-4015-9a0c-69115b3a0739" containerName="multus-admission-controller"
Mar 08 03:25:53.088906 master-0 kubenswrapper[13046]: I0308 03:25:53.084153 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.094433 master-0 kubenswrapper[13046]: I0308 03:25:53.094305 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7977cd7c97-8tssk"]
Mar 08 03:25:53.175007 master-0 kubenswrapper[13046]: I0308 03:25:53.174765 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-trusted-ca-bundle\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.175007 master-0 kubenswrapper[13046]: I0308 03:25:53.174845 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599cs\" (UniqueName: \"kubernetes.io/projected/16bab511-b73b-4215-824b-641d45e7987b-kube-api-access-599cs\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.175007 master-0 kubenswrapper[13046]: I0308 03:25:53.174909 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-serving-cert\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.175007 master-0 kubenswrapper[13046]: I0308 03:25:53.174926 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-oauth-serving-cert\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.175007 master-0 kubenswrapper[13046]: I0308 03:25:53.174954 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-oauth-config\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.175007 master-0 kubenswrapper[13046]: I0308 03:25:53.174978 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-console-config\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.175007 master-0 kubenswrapper[13046]: I0308 03:25:53.175014 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-service-ca\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.277963 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599cs\" (UniqueName: \"kubernetes.io/projected/16bab511-b73b-4215-824b-641d45e7987b-kube-api-access-599cs\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.278029 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-serving-cert\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.278045 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-oauth-serving-cert\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.278065 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-oauth-config\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.278086 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-console-config\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.278106 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-service-ca\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.278165 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-trusted-ca-bundle\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.282551 master-0 kubenswrapper[13046]: I0308 03:25:53.279061 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-trusted-ca-bundle\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.288505 master-0 kubenswrapper[13046]: I0308 03:25:53.284147 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-oauth-config\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.288505 master-0 kubenswrapper[13046]: I0308 03:25:53.284546 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-serving-cert\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.288505 master-0 kubenswrapper[13046]: I0308 03:25:53.284771 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-oauth-serving-cert\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.288505 master-0 kubenswrapper[13046]: I0308 03:25:53.285364 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-service-ca\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.288859 master-0 kubenswrapper[13046]: I0308 03:25:53.288826 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-console-config\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.302017 master-0 kubenswrapper[13046]: I0308 03:25:53.301960 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599cs\" (UniqueName: \"kubernetes.io/projected/16bab511-b73b-4215-824b-641d45e7987b-kube-api-access-599cs\") pod \"console-7977cd7c97-8tssk\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") " pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.313547 master-0 kubenswrapper[13046]: I0308 03:25:53.310430 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6fbc9556d8-l758n_9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/route-controller-manager/1.log"
Mar 08 03:25:53.313547 master-0 kubenswrapper[13046]: I0308 03:25:53.310477 13046 generic.go:334] "Generic (PLEG): container finished" podID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerID="b182ce771137b026d98ba2b5690c6edd71dd016caec91bd289049a210a43602b" exitCode=0
Mar 08 03:25:53.313547 master-0 kubenswrapper[13046]: I0308 03:25:53.310544 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" event={"ID":"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5","Type":"ContainerDied","Data":"b182ce771137b026d98ba2b5690c6edd71dd016caec91bd289049a210a43602b"}
Mar 08 03:25:53.313547 master-0 kubenswrapper[13046]: I0308 03:25:53.310577 13046 scope.go:117] "RemoveContainer" containerID="1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6"
Mar 08 03:25:53.313547 master-0 kubenswrapper[13046]: I0308 03:25:53.312095 13046 generic.go:334] "Generic (PLEG): container finished" podID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerID="b49e315423dd470e6b6da770027936f226df443789484303d1380c81a8337f72" exitCode=0
Mar 08 03:25:53.313547 master-0 kubenswrapper[13046]: I0308 03:25:53.312697 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" event={"ID":"6bd07fa0-00f3-4267-b64a-1e7c02fdf148","Type":"ContainerDied","Data":"b49e315423dd470e6b6da770027936f226df443789484303d1380c81a8337f72"}
Mar 08 03:25:53.421594 master-0 kubenswrapper[13046]: I0308 03:25:53.419543 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:25:53.514052 master-0 kubenswrapper[13046]: I0308 03:25:53.513997 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-668dfc897d-db2r2"
Mar 08 03:25:53.515176 master-0 kubenswrapper[13046]: I0308 03:25:53.515145 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body=
Mar 08 03:25:53.515234 master-0 kubenswrapper[13046]: I0308 03:25:53.515185 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused"
Mar 08 03:25:53.515416 master-0 kubenswrapper[13046]: I0308 03:25:53.515388 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-668dfc897d-db2r2"
Mar 08 03:25:54.075547 master-0 kubenswrapper[13046]: I0308 03:25:54.075498 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq"
Mar 08 03:25:54.082982 master-0 kubenswrapper[13046]: I0308 03:25:54.082956 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"
Mar 08 03:25:54.097021 master-0 kubenswrapper[13046]: I0308 03:25:54.096977 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") pod \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") "
Mar 08 03:25:54.097223 master-0 kubenswrapper[13046]: I0308 03:25:54.097200 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") pod \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") "
Mar 08 03:25:54.097295 master-0 kubenswrapper[13046]: I0308 03:25:54.097276 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") pod \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") "
Mar 08 03:25:54.097358 master-0 kubenswrapper[13046]: I0308 03:25:54.097334 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8fg7\" (UniqueName: \"kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7\") pod \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") "
Mar 08 03:25:54.097394 master-0 kubenswrapper[13046]: I0308 03:25:54.097380 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") pod \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\" (UID: \"6bd07fa0-00f3-4267-b64a-1e7c02fdf148\") "
Mar 08 03:25:54.098702 master-0 kubenswrapper[13046]: I0308 03:25:54.098667 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config" (OuterVolumeSpecName: "config") pod "6bd07fa0-00f3-4267-b64a-1e7c02fdf148" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:25:54.099443 master-0 kubenswrapper[13046]: I0308 03:25:54.099403 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca" (OuterVolumeSpecName: "client-ca") pod "6bd07fa0-00f3-4267-b64a-1e7c02fdf148" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:25:54.099874 master-0 kubenswrapper[13046]: I0308 03:25:54.099847 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6bd07fa0-00f3-4267-b64a-1e7c02fdf148" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:25:54.103704 master-0 kubenswrapper[13046]: I0308 03:25:54.103675 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7" (OuterVolumeSpecName: "kube-api-access-h8fg7") pod "6bd07fa0-00f3-4267-b64a-1e7c02fdf148" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148"). InnerVolumeSpecName "kube-api-access-h8fg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:25:54.106065 master-0 kubenswrapper[13046]: I0308 03:25:54.106016 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6bd07fa0-00f3-4267-b64a-1e7c02fdf148" (UID: "6bd07fa0-00f3-4267-b64a-1e7c02fdf148"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.199082 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk7jv\" (UniqueName: \"kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv\") pod \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") "
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200290 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") pod \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") "
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200446 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") pod \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") "
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200470 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") pod \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\" (UID: \"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5\") "
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200807 13046 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200822 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8fg7\" (UniqueName: \"kubernetes.io/projected/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-kube-api-access-h8fg7\") on node \"master-0\" DevicePath \"\""
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200838 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200869 13046 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.200880 13046 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6bd07fa0-00f3-4267-b64a-1e7c02fdf148-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.201390 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" (UID: "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:25:54.203616 master-0 kubenswrapper[13046]: I0308 03:25:54.202031 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config" (OuterVolumeSpecName: "config") pod "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" (UID: "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:25:54.206767 master-0 kubenswrapper[13046]: I0308 03:25:54.206718 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" (UID: "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:25:54.207202 master-0 kubenswrapper[13046]: I0308 03:25:54.206904 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv" (OuterVolumeSpecName: "kube-api-access-wk7jv") pod "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" (UID: "9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5"). InnerVolumeSpecName "kube-api-access-wk7jv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:25:54.214666 master-0 kubenswrapper[13046]: I0308 03:25:54.214462 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bfb6b945f-6ct75"]
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: E0308 03:25:54.215654 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager"
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: I0308 03:25:54.215671 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager"
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: E0308 03:25:54.215686 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager"
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: I0308 03:25:54.215718 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager"
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: E0308 03:25:54.215746 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager"
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: I0308 03:25:54.215752 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager"
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: E0308 03:25:54.215762 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager"
Mar 08 03:25:54.215862 master-0 kubenswrapper[13046]: I0308 03:25:54.215768 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5"
containerName="route-controller-manager" Mar 08 03:25:54.217180 master-0 kubenswrapper[13046]: I0308 03:25:54.216433 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager" Mar 08 03:25:54.217180 master-0 kubenswrapper[13046]: I0308 03:25:54.216468 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" Mar 08 03:25:54.217180 master-0 kubenswrapper[13046]: I0308 03:25:54.216558 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" Mar 08 03:25:54.217180 master-0 kubenswrapper[13046]: I0308 03:25:54.216604 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" Mar 08 03:25:54.217180 master-0 kubenswrapper[13046]: I0308 03:25:54.216620 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager" Mar 08 03:25:54.217331 master-0 kubenswrapper[13046]: I0308 03:25:54.217204 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.230008 master-0 kubenswrapper[13046]: I0308 03:25:54.229692 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bfb6b945f-6ct75"] Mar 08 03:25:54.302750 master-0 kubenswrapper[13046]: I0308 03:25:54.302606 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750ef585-1b42-40ec-a20d-b186ac4c82aa-serving-cert\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.302750 master-0 kubenswrapper[13046]: I0308 03:25:54.302714 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-client-ca\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.303217 master-0 kubenswrapper[13046]: I0308 03:25:54.302762 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9pfp\" (UniqueName: \"kubernetes.io/projected/750ef585-1b42-40ec-a20d-b186ac4c82aa-kube-api-access-l9pfp\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.303217 master-0 kubenswrapper[13046]: I0308 03:25:54.302829 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-config\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: 
\"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.303217 master-0 kubenswrapper[13046]: I0308 03:25:54.302889 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-proxy-ca-bundles\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.303217 master-0 kubenswrapper[13046]: I0308 03:25:54.303067 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:54.303217 master-0 kubenswrapper[13046]: I0308 03:25:54.303080 13046 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:54.303217 master-0 kubenswrapper[13046]: I0308 03:25:54.303090 13046 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:54.303217 master-0 kubenswrapper[13046]: I0308 03:25:54.303099 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk7jv\" (UniqueName: \"kubernetes.io/projected/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5-kube-api-access-wk7jv\") on node \"master-0\" DevicePath \"\"" Mar 08 03:25:54.322765 master-0 kubenswrapper[13046]: I0308 03:25:54.322729 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" Mar 08 03:25:54.322925 master-0 kubenswrapper[13046]: I0308 03:25:54.322725 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" event={"ID":"6bd07fa0-00f3-4267-b64a-1e7c02fdf148","Type":"ContainerDied","Data":"0f7cf2e7c5274f939368432574755e49e40aa59b7ea66be91ed8d72957e680b6"} Mar 08 03:25:54.330996 master-0 kubenswrapper[13046]: I0308 03:25:54.330959 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" event={"ID":"9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5","Type":"ContainerDied","Data":"9c380d8376b93cf0d471da9a093b8dab4577d756ac31e0b75746f35b913cbd11"} Mar 08 03:25:54.331145 master-0 kubenswrapper[13046]: I0308 03:25:54.331005 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n" Mar 08 03:25:54.351720 master-0 kubenswrapper[13046]: I0308 03:25:54.351680 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6494b94d74-kwkcq"] Mar 08 03:25:54.365497 master-0 kubenswrapper[13046]: I0308 03:25:54.365437 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6494b94d74-kwkcq"] Mar 08 03:25:54.373844 master-0 kubenswrapper[13046]: I0308 03:25:54.373773 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"] Mar 08 03:25:54.377881 master-0 kubenswrapper[13046]: I0308 03:25:54.377843 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fbc9556d8-l758n"] Mar 08 03:25:54.404339 master-0 kubenswrapper[13046]: I0308 03:25:54.404291 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-config\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.404457 master-0 kubenswrapper[13046]: I0308 03:25:54.404377 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-proxy-ca-bundles\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.405084 master-0 kubenswrapper[13046]: I0308 03:25:54.405044 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750ef585-1b42-40ec-a20d-b186ac4c82aa-serving-cert\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.405145 master-0 kubenswrapper[13046]: I0308 03:25:54.405118 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-client-ca\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.405192 master-0 kubenswrapper[13046]: I0308 03:25:54.405148 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9pfp\" (UniqueName: \"kubernetes.io/projected/750ef585-1b42-40ec-a20d-b186ac4c82aa-kube-api-access-l9pfp\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: 
\"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.405912 master-0 kubenswrapper[13046]: I0308 03:25:54.405878 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-config\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.405979 master-0 kubenswrapper[13046]: I0308 03:25:54.405882 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-client-ca\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.407147 master-0 kubenswrapper[13046]: I0308 03:25:54.407113 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750ef585-1b42-40ec-a20d-b186ac4c82aa-proxy-ca-bundles\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.408421 master-0 kubenswrapper[13046]: I0308 03:25:54.408392 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/750ef585-1b42-40ec-a20d-b186ac4c82aa-serving-cert\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.423869 master-0 kubenswrapper[13046]: I0308 03:25:54.423838 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9pfp\" (UniqueName: 
\"kubernetes.io/projected/750ef585-1b42-40ec-a20d-b186ac4c82aa-kube-api-access-l9pfp\") pod \"controller-manager-5bfb6b945f-6ct75\" (UID: \"750ef585-1b42-40ec-a20d-b186ac4c82aa\") " pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.550590 master-0 kubenswrapper[13046]: I0308 03:25:54.550452 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" Mar 08 03:25:54.723826 master-0 kubenswrapper[13046]: I0308 03:25:54.723779 13046 scope.go:117] "RemoveContainer" containerID="1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244" Mar 08 03:25:54.767623 master-0 kubenswrapper[13046]: I0308 03:25:54.767597 13046 scope.go:117] "RemoveContainer" containerID="b49e315423dd470e6b6da770027936f226df443789484303d1380c81a8337f72" Mar 08 03:25:54.785666 master-0 kubenswrapper[13046]: I0308 03:25:54.785630 13046 scope.go:117] "RemoveContainer" containerID="1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244" Mar 08 03:25:54.786059 master-0 kubenswrapper[13046]: E0308 03:25:54.786018 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244\": container with ID starting with 1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244 not found: ID does not exist" containerID="1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244" Mar 08 03:25:54.786106 master-0 kubenswrapper[13046]: I0308 03:25:54.786062 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244"} err="failed to get container status \"1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244\": rpc error: code = NotFound desc = could not find container 
\"1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244\": container with ID starting with 1341742f26056f85141655bd1472688511b0d1e1ae0a8f484a5a4035e80a4244 not found: ID does not exist" Mar 08 03:25:54.786106 master-0 kubenswrapper[13046]: I0308 03:25:54.786091 13046 scope.go:117] "RemoveContainer" containerID="b182ce771137b026d98ba2b5690c6edd71dd016caec91bd289049a210a43602b" Mar 08 03:25:54.816092 master-0 kubenswrapper[13046]: I0308 03:25:54.816059 13046 scope.go:117] "RemoveContainer" containerID="1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6" Mar 08 03:25:54.816455 master-0 kubenswrapper[13046]: E0308 03:25:54.816393 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6\": container with ID starting with 1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6 not found: ID does not exist" containerID="1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6" Mar 08 03:25:54.816455 master-0 kubenswrapper[13046]: I0308 03:25:54.816428 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6"} err="failed to get container status \"1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6\": rpc error: code = NotFound desc = could not find container \"1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6\": container with ID starting with 1a698fe65bdacae5984069cbf9d2eeb3ffe940ed272ec908c59a6046a6c0e8e6 not found: ID does not exist" Mar 08 03:25:55.153881 master-0 kubenswrapper[13046]: I0308 03:25:55.153397 13046 patch_prober.go:28] interesting pod/controller-manager-6494b94d74-kwkcq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.70:8443/healthz\": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:25:55.153881 master-0 kubenswrapper[13046]: I0308 03:25:55.153478 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6494b94d74-kwkcq" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.70:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:25:55.156533 master-0 kubenswrapper[13046]: I0308 03:25:55.154229 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-dc7d49677-gsx8f" Mar 08 03:25:55.341037 master-0 kubenswrapper[13046]: I0308 03:25:55.340997 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" event={"ID":"b3311dbb-cd30-4cd9-9f18-d360521bec39","Type":"ContainerStarted","Data":"2b7ba9c225b9dfbfcbb8eda6690a82847a7993a619a2d917eb7936bdd4191189"} Mar 08 03:25:55.377545 master-0 kubenswrapper[13046]: I0308 03:25:55.373930 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-766946c477-ghtbv" podStartSLOduration=7.110180416 podStartE2EDuration="12.373908971s" podCreationTimestamp="2026-03-08 03:25:43 +0000 UTC" firstStartedPulling="2026-03-08 03:25:49.503963531 +0000 UTC m=+751.582730748" lastFinishedPulling="2026-03-08 03:25:54.767692086 +0000 UTC m=+756.846459303" observedRunningTime="2026-03-08 03:25:55.368412679 +0000 UTC m=+757.447179906" watchObservedRunningTime="2026-03-08 03:25:55.373908971 +0000 UTC m=+757.452676198" Mar 08 03:25:55.631814 master-0 kubenswrapper[13046]: I0308 03:25:55.631582 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bfb6b945f-6ct75"] Mar 08 03:25:55.693946 master-0 
kubenswrapper[13046]: I0308 03:25:55.692684 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7977cd7c97-8tssk"] Mar 08 03:25:56.127789 master-0 kubenswrapper[13046]: I0308 03:25:56.127700 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd07fa0-00f3-4267-b64a-1e7c02fdf148" path="/var/lib/kubelet/pods/6bd07fa0-00f3-4267-b64a-1e7c02fdf148/volumes" Mar 08 03:25:56.128591 master-0 kubenswrapper[13046]: I0308 03:25:56.128554 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" path="/var/lib/kubelet/pods/9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5/volumes" Mar 08 03:25:56.227554 master-0 kubenswrapper[13046]: I0308 03:25:56.227469 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz"] Mar 08 03:25:56.227845 master-0 kubenswrapper[13046]: E0308 03:25:56.227826 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" Mar 08 03:25:56.227888 master-0 kubenswrapper[13046]: I0308 03:25:56.227848 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c61d80b-5ca1-40d7-adb2-fb01a4b0d9d5" containerName="route-controller-manager" Mar 08 03:25:56.229777 master-0 kubenswrapper[13046]: I0308 03:25:56.229753 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.233936 master-0 kubenswrapper[13046]: I0308 03:25:56.233068 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:25:56.233936 master-0 kubenswrapper[13046]: I0308 03:25:56.233439 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 03:25:56.233936 master-0 kubenswrapper[13046]: I0308 03:25:56.233624 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-xfphx" Mar 08 03:25:56.233936 master-0 kubenswrapper[13046]: I0308 03:25:56.233855 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:25:56.234267 master-0 kubenswrapper[13046]: I0308 03:25:56.234227 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:25:56.234517 master-0 kubenswrapper[13046]: I0308 03:25:56.234469 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:25:56.234942 master-0 kubenswrapper[13046]: I0308 03:25:56.234901 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz"] Mar 08 03:25:56.359361 master-0 kubenswrapper[13046]: I0308 03:25:56.359309 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-766946c477-ghtbv" event={"ID":"1893fc9c-7c29-4674-8011-f046dd63a08b","Type":"ContainerStarted","Data":"8dd2c8f881f95f59c8e57e1c88d72df8d334dceb5c7b0b400d78c31ba5940708"} Mar 08 03:25:56.363779 master-0 kubenswrapper[13046]: I0308 03:25:56.363716 13046 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerStarted","Data":"b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0"} Mar 08 03:25:56.382262 master-0 kubenswrapper[13046]: I0308 03:25:56.382087 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9574429d-2df9-4023-8a71-9bf2bdb27b7f-config\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.382262 master-0 kubenswrapper[13046]: I0308 03:25:56.382212 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9574429d-2df9-4023-8a71-9bf2bdb27b7f-serving-cert\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.382525 master-0 kubenswrapper[13046]: I0308 03:25:56.382241 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62fcw\" (UniqueName: \"kubernetes.io/projected/9574429d-2df9-4023-8a71-9bf2bdb27b7f-kube-api-access-62fcw\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.382525 master-0 kubenswrapper[13046]: I0308 03:25:56.382348 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9574429d-2df9-4023-8a71-9bf2bdb27b7f-client-ca\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: 
\"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.483888 master-0 kubenswrapper[13046]: I0308 03:25:56.483817 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9574429d-2df9-4023-8a71-9bf2bdb27b7f-config\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.484138 master-0 kubenswrapper[13046]: I0308 03:25:56.483945 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9574429d-2df9-4023-8a71-9bf2bdb27b7f-serving-cert\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.484138 master-0 kubenswrapper[13046]: I0308 03:25:56.483972 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62fcw\" (UniqueName: \"kubernetes.io/projected/9574429d-2df9-4023-8a71-9bf2bdb27b7f-kube-api-access-62fcw\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.484138 master-0 kubenswrapper[13046]: I0308 03:25:56.484027 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9574429d-2df9-4023-8a71-9bf2bdb27b7f-client-ca\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.488633 master-0 kubenswrapper[13046]: I0308 03:25:56.487897 
13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9574429d-2df9-4023-8a71-9bf2bdb27b7f-config\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.488633 master-0 kubenswrapper[13046]: I0308 03:25:56.488464 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9574429d-2df9-4023-8a71-9bf2bdb27b7f-client-ca\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.500084 master-0 kubenswrapper[13046]: I0308 03:25:56.500015 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9574429d-2df9-4023-8a71-9bf2bdb27b7f-serving-cert\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.509610 master-0 kubenswrapper[13046]: I0308 03:25:56.509554 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62fcw\" (UniqueName: \"kubernetes.io/projected/9574429d-2df9-4023-8a71-9bf2bdb27b7f-kube-api-access-62fcw\") pod \"route-controller-manager-5496fd685c-2j2pz\" (UID: \"9574429d-2df9-4023-8a71-9bf2bdb27b7f\") " pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" Mar 08 03:25:56.563545 master-0 kubenswrapper[13046]: I0308 03:25:56.563403 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz"
Mar 08 03:25:56.680378 master-0 kubenswrapper[13046]: W0308 03:25:56.680334 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod750ef585_1b42_40ec_a20d_b186ac4c82aa.slice/crio-0faa1c6e5e71658364d93e5378bcf408c898571d4ebe560d9d630d8fa15a734e WatchSource:0}: Error finding container 0faa1c6e5e71658364d93e5378bcf408c898571d4ebe560d9d630d8fa15a734e: Status 404 returned error can't find the container with id 0faa1c6e5e71658364d93e5378bcf408c898571d4ebe560d9d630d8fa15a734e
Mar 08 03:25:56.693581 master-0 kubenswrapper[13046]: W0308 03:25:56.693533 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16bab511_b73b_4215_824b_641d45e7987b.slice/crio-67bf768e562b8de1c76885a3794a14013dfc5d68445bd6ce75388a56d49f7f17 WatchSource:0}: Error finding container 67bf768e562b8de1c76885a3794a14013dfc5d68445bd6ce75388a56d49f7f17: Status 404 returned error can't find the container with id 67bf768e562b8de1c76885a3794a14013dfc5d68445bd6ce75388a56d49f7f17
Mar 08 03:25:57.162827 master-0 kubenswrapper[13046]: I0308 03:25:57.162212 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz"]
Mar 08 03:25:57.408437 master-0 kubenswrapper[13046]: I0308 03:25:57.408385 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7977cd7c97-8tssk" event={"ID":"16bab511-b73b-4215-824b-641d45e7987b","Type":"ContainerStarted","Data":"3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666"}
Mar 08 03:25:57.408796 master-0 kubenswrapper[13046]: I0308 03:25:57.408447 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7977cd7c97-8tssk" event={"ID":"16bab511-b73b-4215-824b-641d45e7987b","Type":"ContainerStarted","Data":"67bf768e562b8de1c76885a3794a14013dfc5d68445bd6ce75388a56d49f7f17"}
Mar 08 03:25:57.449523 master-0 kubenswrapper[13046]: I0308 03:25:57.448355 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7977cd7c97-8tssk" podStartSLOduration=4.448336961 podStartE2EDuration="4.448336961s" podCreationTimestamp="2026-03-08 03:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:57.447771445 +0000 UTC m=+759.526538662" watchObservedRunningTime="2026-03-08 03:25:57.448336961 +0000 UTC m=+759.527104178"
Mar 08 03:25:57.463563 master-0 kubenswrapper[13046]: I0308 03:25:57.460412 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerStarted","Data":"c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb"}
Mar 08 03:25:57.463563 master-0 kubenswrapper[13046]: I0308 03:25:57.460457 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerStarted","Data":"d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392"}
Mar 08 03:25:57.463563 master-0 kubenswrapper[13046]: I0308 03:25:57.460466 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerStarted","Data":"38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301"}
Mar 08 03:25:57.483769 master-0 kubenswrapper[13046]: I0308 03:25:57.483736 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerStarted","Data":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"}
Mar 08 03:25:57.483888 master-0 kubenswrapper[13046]: I0308 03:25:57.483871 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerStarted","Data":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"}
Mar 08 03:25:57.483963 master-0 kubenswrapper[13046]: I0308 03:25:57.483949 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerStarted","Data":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"}
Mar 08 03:25:57.485662 master-0 kubenswrapper[13046]: I0308 03:25:57.485637 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" event={"ID":"b3311dbb-cd30-4cd9-9f18-d360521bec39","Type":"ContainerStarted","Data":"59bb2ecdc5abcc11c267f51c4b0423fb079ef7200836df809b5b2def471e757a"}
Mar 08 03:25:57.485786 master-0 kubenswrapper[13046]: I0308 03:25:57.485768 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" event={"ID":"b3311dbb-cd30-4cd9-9f18-d360521bec39","Type":"ContainerStarted","Data":"a58c9a85d23eef03bab1ed4d3d9e559a46537827e27092dbb3b7bb8a09818917"}
Mar 08 03:25:57.487064 master-0 kubenswrapper[13046]: I0308 03:25:57.487045 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" event={"ID":"9574429d-2df9-4023-8a71-9bf2bdb27b7f","Type":"ContainerStarted","Data":"986365855f926040ea9f88bcb16eed1bf7dae018a8694fdd053db64144d7e7c6"}
Mar 08 03:25:57.487154 master-0 kubenswrapper[13046]: I0308 03:25:57.487141 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" event={"ID":"9574429d-2df9-4023-8a71-9bf2bdb27b7f","Type":"ContainerStarted","Data":"474e787276cde10b05eec5c70653b563020ba8370967f3760221e8e5e22a67c2"}
Mar 08 03:25:57.487255 master-0 kubenswrapper[13046]: I0308 03:25:57.487243 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz"
Mar 08 03:25:57.489343 master-0 kubenswrapper[13046]: I0308 03:25:57.489297 13046 patch_prober.go:28] interesting pod/route-controller-manager-5496fd685c-2j2pz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.105:8443/healthz\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 08 03:25:57.489407 master-0 kubenswrapper[13046]: I0308 03:25:57.489340 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" podUID="9574429d-2df9-4023-8a71-9bf2bdb27b7f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.105:8443/healthz\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 08 03:25:57.493103 master-0 kubenswrapper[13046]: I0308 03:25:57.493013 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" event={"ID":"750ef585-1b42-40ec-a20d-b186ac4c82aa","Type":"ContainerStarted","Data":"5895e7d72217a4ea5906975c5e6438a553df56eec2dd482a882c2f145b20dabb"}
Mar 08 03:25:57.493103 master-0 kubenswrapper[13046]: I0308 03:25:57.493056 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" event={"ID":"750ef585-1b42-40ec-a20d-b186ac4c82aa","Type":"ContainerStarted","Data":"0faa1c6e5e71658364d93e5378bcf408c898571d4ebe560d9d630d8fa15a734e"}
Mar 08 03:25:57.495253 master-0 kubenswrapper[13046]: I0308 03:25:57.494720 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75"
Mar 08 03:25:57.501473 master-0 kubenswrapper[13046]: I0308 03:25:57.501304 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75"
Mar 08 03:25:57.515926 master-0 kubenswrapper[13046]: I0308 03:25:57.515865 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz" podStartSLOduration=5.5158501730000005 podStartE2EDuration="5.515850173s" podCreationTimestamp="2026-03-08 03:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:57.509821435 +0000 UTC m=+759.588588652" watchObservedRunningTime="2026-03-08 03:25:57.515850173 +0000 UTC m=+759.594617380"
Mar 08 03:25:57.537505 master-0 kubenswrapper[13046]: I0308 03:25:57.535308 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bfb6b945f-6ct75" podStartSLOduration=5.535289721 podStartE2EDuration="5.535289721s" podCreationTimestamp="2026-03-08 03:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:25:57.534765127 +0000 UTC m=+759.613532344" watchObservedRunningTime="2026-03-08 03:25:57.535289721 +0000 UTC m=+759.614056938"
Mar 08 03:25:58.513748 master-0 kubenswrapper[13046]: I0308 03:25:58.513679 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerStarted","Data":"e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b"}
Mar 08 03:25:58.529566 master-0 kubenswrapper[13046]: I0308 03:25:58.528130 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerStarted","Data":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"}
Mar 08 03:25:58.530063 master-0 kubenswrapper[13046]: I0308 03:25:58.529592 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerStarted","Data":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"}
Mar 08 03:25:58.530063 master-0 kubenswrapper[13046]: I0308 03:25:58.529630 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerStarted","Data":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"}
Mar 08 03:25:58.536243 master-0 kubenswrapper[13046]: I0308 03:25:58.535749 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5496fd685c-2j2pz"
Mar 08 03:25:58.579745 master-0 kubenswrapper[13046]: I0308 03:25:58.579456 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=8.036429424 podStartE2EDuration="12.579433109s" podCreationTimestamp="2026-03-08 03:25:46 +0000 UTC" firstStartedPulling="2026-03-08 03:25:52.236427812 +0000 UTC m=+754.315195029" lastFinishedPulling="2026-03-08 03:25:56.779431497 +0000 UTC m=+758.858198714" observedRunningTime="2026-03-08 03:25:58.578583145 +0000 UTC m=+760.657350392" watchObservedRunningTime="2026-03-08 03:25:58.579433109 +0000 UTC m=+760.658200346"
Mar 08 03:25:58.783275 master-0 kubenswrapper[13046]: I0308 03:25:58.783220 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:25:59.543528 master-0 kubenswrapper[13046]: I0308 03:25:59.543406 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" event={"ID":"b3311dbb-cd30-4cd9-9f18-d360521bec39","Type":"ContainerStarted","Data":"53e9b996ace4992cf5cbcf3488032cf202eadb6ab2763b4cde83f4ba45aa9f85"}
Mar 08 03:25:59.543528 master-0 kubenswrapper[13046]: I0308 03:25:59.543469 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" event={"ID":"b3311dbb-cd30-4cd9-9f18-d360521bec39","Type":"ContainerStarted","Data":"c48b69887377a26cb62d6e874bab02c03c2df59b482cbcf1f449bdf22cbafcc2"}
Mar 08 03:25:59.543528 master-0 kubenswrapper[13046]: I0308 03:25:59.543511 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" event={"ID":"b3311dbb-cd30-4cd9-9f18-d360521bec39","Type":"ContainerStarted","Data":"11db85fe48f7a0ac7f8d96265d7a68220025d397ae122716913be606eb53c27f"}
Mar 08 03:25:59.544712 master-0 kubenswrapper[13046]: I0308 03:25:59.543834 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz"
Mar 08 03:25:59.547468 master-0 kubenswrapper[13046]: I0308 03:25:59.547395 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerStarted","Data":"90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15"}
Mar 08 03:25:59.578828 master-0 kubenswrapper[13046]: I0308 03:25:59.578670 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz" podStartSLOduration=6.705922919 podStartE2EDuration="16.578649599s" podCreationTimestamp="2026-03-08 03:25:43 +0000 UTC" firstStartedPulling="2026-03-08 03:25:48.522257445 +0000 UTC m=+750.601024662" lastFinishedPulling="2026-03-08 03:25:58.394984125 +0000 UTC m=+760.473751342" observedRunningTime="2026-03-08 03:25:59.572833668 +0000 UTC m=+761.651600885" watchObservedRunningTime="2026-03-08 03:25:59.578649599 +0000 UTC m=+761.657416816"
Mar 08 03:25:59.632341 master-0 kubenswrapper[13046]: I0308 03:25:59.632264 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.537561271 podStartE2EDuration="19.632243575s" podCreationTimestamp="2026-03-08 03:25:40 +0000 UTC" firstStartedPulling="2026-03-08 03:25:41.308141248 +0000 UTC m=+743.386908465" lastFinishedPulling="2026-03-08 03:25:58.402823562 +0000 UTC m=+760.481590769" observedRunningTime="2026-03-08 03:25:59.623116452 +0000 UTC m=+761.701883689" watchObservedRunningTime="2026-03-08 03:25:59.632243575 +0000 UTC m=+761.711010792"
Mar 08 03:26:00.149048 master-0 kubenswrapper[13046]: I0308 03:26:00.148982 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 08 03:26:00.149414 master-0 kubenswrapper[13046]: I0308 03:26:00.149297 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" containerID="cri-o://85d08f576755b1f4982d207073aca843243efd692daf52094875911a38bb8b2f" gracePeriod=30
Mar 08 03:26:00.149502 master-0 kubenswrapper[13046]: I0308 03:26:00.149471 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" containerID="cri-o://b162d8349d83460cb664c5872e401282175cb86df3f0012fb7fce29a941e6bca" gracePeriod=30
Mar 08 03:26:00.149548 master-0 kubenswrapper[13046]: I0308 03:26:00.149537 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" containerID="cri-o://35834a4aa2c1aafe2e80cffe71b4934a09d612026d02ceb8d478e6578d08c89b" gracePeriod=30
Mar 08 03:26:00.150143 master-0 kubenswrapper[13046]: I0308 03:26:00.150095 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: E0308 03:26:00.150502 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="wait-for-host-port"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: I0308 03:26:00.150518 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="wait-for-host-port"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: E0308 03:26:00.150672 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: I0308 03:26:00.150683 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: E0308 03:26:00.150693 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: I0308 03:26:00.150701 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: E0308 03:26:00.150748 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: I0308 03:26:00.150758 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: I0308 03:26:00.150922 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: I0308 03:26:00.150946 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler"
Mar 08 03:26:00.151390 master-0 kubenswrapper[13046]: I0308 03:26:00.150978 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer"
Mar 08 03:26:00.278551 master-0 kubenswrapper[13046]: I0308 03:26:00.278430 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.278674 master-0 kubenswrapper[13046]: I0308 03:26:00.278566 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.335607 master-0 kubenswrapper[13046]: I0308 03:26:00.335567 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler-cert-syncer/0.log"
Mar 08 03:26:00.337418 master-0 kubenswrapper[13046]: I0308 03:26:00.337393 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.346935 master-0 kubenswrapper[13046]: I0308 03:26:00.346073 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91"
Mar 08 03:26:00.380201 master-0 kubenswrapper[13046]: I0308 03:26:00.380153 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.380298 master-0 kubenswrapper[13046]: I0308 03:26:00.380208 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.380456 master-0 kubenswrapper[13046]: I0308 03:26:00.380355 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.380456 master-0 kubenswrapper[13046]: I0308 03:26:00.380429 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.482175 master-0 kubenswrapper[13046]: I0308 03:26:00.482028 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"1d3d45b6ce1b3764f9927e623a71adf8\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") "
Mar 08 03:26:00.482346 master-0 kubenswrapper[13046]: I0308 03:26:00.482257 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "1d3d45b6ce1b3764f9927e623a71adf8" (UID: "1d3d45b6ce1b3764f9927e623a71adf8"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:26:00.482400 master-0 kubenswrapper[13046]: I0308 03:26:00.482379 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"1d3d45b6ce1b3764f9927e623a71adf8\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") "
Mar 08 03:26:00.482603 master-0 kubenswrapper[13046]: I0308 03:26:00.482560 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "1d3d45b6ce1b3764f9927e623a71adf8" (UID: "1d3d45b6ce1b3764f9927e623a71adf8"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:26:00.483024 master-0 kubenswrapper[13046]: I0308 03:26:00.482988 13046 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:00.483024 master-0 kubenswrapper[13046]: I0308 03:26:00.483014 13046 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:00.561011 master-0 kubenswrapper[13046]: I0308 03:26:00.560951 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler-cert-syncer/0.log"
Mar 08 03:26:00.562088 master-0 kubenswrapper[13046]: I0308 03:26:00.561969 13046 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="b162d8349d83460cb664c5872e401282175cb86df3f0012fb7fce29a941e6bca" exitCode=0
Mar 08 03:26:00.562088 master-0 kubenswrapper[13046]: I0308 03:26:00.562028 13046 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="35834a4aa2c1aafe2e80cffe71b4934a09d612026d02ceb8d478e6578d08c89b" exitCode=2
Mar 08 03:26:00.562418 master-0 kubenswrapper[13046]: I0308 03:26:00.562258 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:26:00.562548 master-0 kubenswrapper[13046]: I0308 03:26:00.562049 13046 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="85d08f576755b1f4982d207073aca843243efd692daf52094875911a38bb8b2f" exitCode=0
Mar 08 03:26:00.562708 master-0 kubenswrapper[13046]: I0308 03:26:00.562611 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2555f1b92822f2870135919975dbd486303fecca78d27947d144c55e2df2020c"
Mar 08 03:26:00.564721 master-0 kubenswrapper[13046]: I0308 03:26:00.564651 13046 generic.go:334] "Generic (PLEG): container finished" podID="e8d88f12-2fa2-4f01-badf-3543770a14f1" containerID="c039518268c4e5b626b34db008084a8b51e0328a3e98352d92c8cc40d2d679f2" exitCode=0
Mar 08 03:26:00.564987 master-0 kubenswrapper[13046]: I0308 03:26:00.564716 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"e8d88f12-2fa2-4f01-badf-3543770a14f1","Type":"ContainerDied","Data":"c039518268c4e5b626b34db008084a8b51e0328a3e98352d92c8cc40d2d679f2"}
Mar 08 03:26:00.566415 master-0 kubenswrapper[13046]: I0308 03:26:00.566363 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91"
Mar 08 03:26:00.609277 master-0 kubenswrapper[13046]: I0308 03:26:00.608671 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91"
Mar 08 03:26:01.610272 master-0 kubenswrapper[13046]: I0308 03:26:01.610192 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:26:02.126520 master-0 kubenswrapper[13046]: I0308 03:26:02.126470 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3d45b6ce1b3764f9927e623a71adf8" path="/var/lib/kubelet/pods/1d3d45b6ce1b3764f9927e623a71adf8/volumes"
Mar 08 03:26:03.420261 master-0 kubenswrapper[13046]: I0308 03:26:03.420159 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:26:03.420261 master-0 kubenswrapper[13046]: I0308 03:26:03.420240 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:26:03.422354 master-0 kubenswrapper[13046]: I0308 03:26:03.422302 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 03:26:03.422433 master-0 kubenswrapper[13046]: I0308 03:26:03.422373 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 03:26:03.513621 master-0 kubenswrapper[13046]: I0308 03:26:03.513576 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body=
Mar 08 03:26:03.513834 master-0 kubenswrapper[13046]: I0308 03:26:03.513639 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused"
Mar 08 03:26:03.559248 master-0 kubenswrapper[13046]: I0308 03:26:03.559191 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-766946c477-ghtbv"
Mar 08 03:26:03.560664 master-0 kubenswrapper[13046]: I0308 03:26:03.560626 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-766946c477-ghtbv"
Mar 08 03:26:03.588507 master-0 kubenswrapper[13046]: I0308 03:26:03.588075 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 08 03:26:03.588507 master-0 kubenswrapper[13046]: I0308 03:26:03.588352 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" containerName="installer" containerID="cri-o://61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24" gracePeriod=30
Mar 08 03:26:03.617922 master-0 kubenswrapper[13046]: I0308 03:26:03.612603 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-688d9d49f9-nnqwz"
Mar 08 03:26:06.586851 master-0 kubenswrapper[13046]: I0308 03:26:06.586764 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 08 03:26:06.587909 master-0 kubenswrapper[13046]: I0308 03:26:06.587598 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.602463 master-0 kubenswrapper[13046]: I0308 03:26:06.599816 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 08 03:26:06.620397 master-0 kubenswrapper[13046]: I0308 03:26:06.620047 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-var-lock\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.620397 master-0 kubenswrapper[13046]: I0308 03:26:06.620085 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.620397 master-0 kubenswrapper[13046]: I0308 03:26:06.620194 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8517691b-937c-4cde-a7d2-fe18d6b7193d-kube-api-access\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.721679 master-0 kubenswrapper[13046]: I0308 03:26:06.721612 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8517691b-937c-4cde-a7d2-fe18d6b7193d-kube-api-access\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.721935 master-0 kubenswrapper[13046]: I0308 03:26:06.721790 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-var-lock\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.721935 master-0 kubenswrapper[13046]: I0308 03:26:06.721815 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.721935 master-0 kubenswrapper[13046]: I0308 03:26:06.721914 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.722261 master-0 kubenswrapper[13046]: I0308 03:26:06.721956 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-var-lock\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.741883 master-0 kubenswrapper[13046]: I0308 03:26:06.741840 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8517691b-937c-4cde-a7d2-fe18d6b7193d-kube-api-access\") pod \"installer-5-master-0\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:06.818672 master-0 kubenswrapper[13046]: I0308 03:26:06.818600 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" podUID="1f559362-f339-4de3-9666-757654e9c35e" containerName="oauth-openshift" containerID="cri-o://baf31e0018548ce23470fd372aed90f69f405884c0594cc4cd64bd69dc859f86" gracePeriod=15
Mar 08 03:26:06.907711 master-0 kubenswrapper[13046]: I0308 03:26:06.907647 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:26:08.081122 master-0 kubenswrapper[13046]: I0308 03:26:08.081058 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 08 03:26:08.081710 master-0 kubenswrapper[13046]: I0308 03:26:08.081358 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager" containerID="cri-o://90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" gracePeriod=30
Mar 08 03:26:08.081710 master-0 kubenswrapper[13046]: I0308 03:26:08.081553 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" gracePeriod=30
Mar 08 03:26:08.081710 master-0 kubenswrapper[13046]: I0308 03:26:08.081622 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" gracePeriod=30
Mar 08 03:26:08.081710 master-0 kubenswrapper[13046]: I0308 03:26:08.081664 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b63b5185e5bc481b891676a634cb5625" containerName="cluster-policy-controller" containerID="cri-o://376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" gracePeriod=30
Mar 08 03:26:08.084292 master-0 kubenswrapper[13046]: I0308 03:26:08.084259 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 08 03:26:08.084587 master-0 kubenswrapper[13046]: E0308 03:26:08.084570 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager"
Mar 08 03:26:08.084587 master-0 kubenswrapper[13046]: I0308 03:26:08.084587 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager"
Mar 08 03:26:08.084694 master-0 kubenswrapper[13046]: E0308 03:26:08.084606 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-recovery-controller"
Mar 08 03:26:08.084694 master-0 kubenswrapper[13046]: I0308 03:26:08.084614 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-recovery-controller"
Mar 08 03:26:08.084694 master-0 kubenswrapper[13046]: E0308 03:26:08.084623 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63b5185e5bc481b891676a634cb5625" containerName="cluster-policy-controller"
Mar 08 03:26:08.084694 master-0 kubenswrapper[13046]: I0308 03:26:08.084629 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63b5185e5bc481b891676a634cb5625" containerName="cluster-policy-controller"
Mar 08 03:26:08.084694 master-0 kubenswrapper[13046]: E0308 03:26:08.084656 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-cert-syncer"
Mar 08 03:26:08.084694 master-0 kubenswrapper[13046]: I0308 03:26:08.084663 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-cert-syncer"
Mar 08 03:26:08.084851 master-0 kubenswrapper[13046]: I0308 03:26:08.084790 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-cert-syncer"
Mar 08 03:26:08.084851 master-0 kubenswrapper[13046]: I0308 03:26:08.084803 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63b5185e5bc481b891676a634cb5625" containerName="cluster-policy-controller"
Mar 08 03:26:08.084851 master-0 kubenswrapper[13046]: I0308 03:26:08.084821 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager-recovery-controller"
Mar 08 03:26:08.084851 master-0 kubenswrapper[13046]: I0308 03:26:08.084842 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63b5185e5bc481b891676a634cb5625" containerName="kube-controller-manager"
Mar 08 03:26:08.149823 master-0 kubenswrapper[13046]: I0308 03:26:08.149768 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="b63b5185e5bc481b891676a634cb5625" podUID="7c77ceff52dd4a01c709016eef561173"
Mar 08 03:26:08.173288 master-0 kubenswrapper[13046]: I0308 03:26:08.171696 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7c77ceff52dd4a01c709016eef561173-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7c77ceff52dd4a01c709016eef561173\") "
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:08.173288 master-0 kubenswrapper[13046]: I0308 03:26:08.171930 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7c77ceff52dd4a01c709016eef561173-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7c77ceff52dd4a01c709016eef561173\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:08.272830 master-0 kubenswrapper[13046]: I0308 03:26:08.272788 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7c77ceff52dd4a01c709016eef561173-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7c77ceff52dd4a01c709016eef561173\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:08.273069 master-0 kubenswrapper[13046]: I0308 03:26:08.272858 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7c77ceff52dd4a01c709016eef561173-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7c77ceff52dd4a01c709016eef561173\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:08.273069 master-0 kubenswrapper[13046]: I0308 03:26:08.272954 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7c77ceff52dd4a01c709016eef561173-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7c77ceff52dd4a01c709016eef561173\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:08.273069 master-0 kubenswrapper[13046]: I0308 03:26:08.272965 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/7c77ceff52dd4a01c709016eef561173-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7c77ceff52dd4a01c709016eef561173\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:08.378702 master-0 kubenswrapper[13046]: E0308 03:26:08.378616 13046 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod6c539a17_b57b_446a_b50d_976adc8766ef.slice/crio-conmon-b615173188d3c6ef1796ab2b05330b5d0d8e92137820ea937309af44c50f2c0e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb63b5185e5bc481b891676a634cb5625.slice/crio-conmon-90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb63b5185e5bc481b891676a634cb5625.slice/crio-90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9.scope\": RecentStats: unable to find data in memory cache]" Mar 08 03:26:12.118230 master-0 kubenswrapper[13046]: I0308 03:26:12.118158 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:26:12.151299 master-0 kubenswrapper[13046]: I0308 03:26:12.151241 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="b6c78c18-f450-46dc-a093-933e94dfc54e" Mar 08 03:26:12.151299 master-0 kubenswrapper[13046]: I0308 03:26:12.151286 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="b6c78c18-f450-46dc-a093-933e94dfc54e" Mar 08 03:26:12.303127 master-0 kubenswrapper[13046]: I0308 03:26:12.303066 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 03:26:12.321539 master-0 kubenswrapper[13046]: I0308 03:26:12.321213 13046 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:26:12.340950 master-0 kubenswrapper[13046]: I0308 03:26:12.340868 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 03:26:12.491150 master-0 kubenswrapper[13046]: I0308 03:26:12.491030 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:26:12.493124 master-0 kubenswrapper[13046]: I0308 03:26:12.493089 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 08 03:26:13.199166 master-0 kubenswrapper[13046]: I0308 03:26:13.199113 13046 patch_prober.go:28] interesting pod/oauth-openshift-5fff964fcf-s5mbf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.90:6443/healthz\": dial tcp 10.128.0.90:6443: connect: connection refused" start-of-body= Mar 08 03:26:13.199675 master-0 kubenswrapper[13046]: I0308 03:26:13.199198 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" podUID="1f559362-f339-4de3-9666-757654e9c35e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.90:6443/healthz\": dial tcp 10.128.0.90:6443: connect: connection refused" Mar 08 03:26:13.421292 master-0 kubenswrapper[13046]: I0308 03:26:13.421228 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 03:26:13.421496 master-0 kubenswrapper[13046]: I0308 03:26:13.421311 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 03:26:13.513275 master-0 kubenswrapper[13046]: I0308 03:26:13.513126 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure 
output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body= Mar 08 03:26:13.513275 master-0 kubenswrapper[13046]: I0308 03:26:13.513203 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" Mar 08 03:26:14.860431 master-0 kubenswrapper[13046]: W0308 03:26:14.860347 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1453f6461bf5d599ad65a4656343ee91.slice/crio-5f8dcd4a3eb0d6affcdc907a67ea64046ea0ef8a401fa56d05c919396d7c2e99 WatchSource:0}: Error finding container 5f8dcd4a3eb0d6affcdc907a67ea64046ea0ef8a401fa56d05c919396d7c2e99: Status 404 returned error can't find the container with id 5f8dcd4a3eb0d6affcdc907a67ea64046ea0ef8a401fa56d05c919396d7c2e99 Mar 08 03:26:14.899418 master-0 kubenswrapper[13046]: I0308 03:26:14.899370 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 03:26:15.027203 master-0 kubenswrapper[13046]: I0308 03:26:15.027133 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-var-lock\") pod \"e8d88f12-2fa2-4f01-badf-3543770a14f1\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " Mar 08 03:26:15.027392 master-0 kubenswrapper[13046]: I0308 03:26:15.027280 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-kubelet-dir\") pod \"e8d88f12-2fa2-4f01-badf-3543770a14f1\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " Mar 08 03:26:15.027638 master-0 kubenswrapper[13046]: I0308 03:26:15.027589 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-var-lock" (OuterVolumeSpecName: "var-lock") pod "e8d88f12-2fa2-4f01-badf-3543770a14f1" (UID: "e8d88f12-2fa2-4f01-badf-3543770a14f1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:26:15.027689 master-0 kubenswrapper[13046]: I0308 03:26:15.027654 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e8d88f12-2fa2-4f01-badf-3543770a14f1" (UID: "e8d88f12-2fa2-4f01-badf-3543770a14f1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:26:15.028653 master-0 kubenswrapper[13046]: I0308 03:26:15.028609 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d88f12-2fa2-4f01-badf-3543770a14f1-kube-api-access\") pod \"e8d88f12-2fa2-4f01-badf-3543770a14f1\" (UID: \"e8d88f12-2fa2-4f01-badf-3543770a14f1\") " Mar 08 03:26:15.029113 master-0 kubenswrapper[13046]: I0308 03:26:15.029077 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.029113 master-0 kubenswrapper[13046]: I0308 03:26:15.029103 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8d88f12-2fa2-4f01-badf-3543770a14f1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.039444 master-0 kubenswrapper[13046]: I0308 03:26:15.039362 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d88f12-2fa2-4f01-badf-3543770a14f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e8d88f12-2fa2-4f01-badf-3543770a14f1" (UID: "e8d88f12-2fa2-4f01-badf-3543770a14f1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:26:15.131526 master-0 kubenswrapper[13046]: I0308 03:26:15.130703 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e8d88f12-2fa2-4f01-badf-3543770a14f1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.210543 master-0 kubenswrapper[13046]: I0308 03:26:15.210506 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_b63b5185e5bc481b891676a634cb5625/kube-controller-manager-cert-syncer/0.log" Mar 08 03:26:15.211931 master-0 kubenswrapper[13046]: I0308 03:26:15.211891 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:15.333667 master-0 kubenswrapper[13046]: I0308 03:26:15.333560 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-cert-dir\") pod \"b63b5185e5bc481b891676a634cb5625\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " Mar 08 03:26:15.333947 master-0 kubenswrapper[13046]: I0308 03:26:15.333752 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-resource-dir\") pod \"b63b5185e5bc481b891676a634cb5625\" (UID: \"b63b5185e5bc481b891676a634cb5625\") " Mar 08 03:26:15.333947 master-0 kubenswrapper[13046]: I0308 03:26:15.333744 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b63b5185e5bc481b891676a634cb5625" (UID: "b63b5185e5bc481b891676a634cb5625"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:26:15.333947 master-0 kubenswrapper[13046]: I0308 03:26:15.333791 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b63b5185e5bc481b891676a634cb5625" (UID: "b63b5185e5bc481b891676a634cb5625"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:26:15.334337 master-0 kubenswrapper[13046]: I0308 03:26:15.334307 13046 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.334422 master-0 kubenswrapper[13046]: I0308 03:26:15.334346 13046 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b63b5185e5bc481b891676a634cb5625-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.424886 master-0 kubenswrapper[13046]: I0308 03:26:15.424028 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="b63b5185e5bc481b891676a634cb5625" podUID="7c77ceff52dd4a01c709016eef561173" Mar 08 03:26:15.587892 master-0 kubenswrapper[13046]: I0308 03:26:15.587840 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 08 03:26:15.718925 master-0 kubenswrapper[13046]: I0308 03:26:15.718820 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"e8d88f12-2fa2-4f01-badf-3543770a14f1","Type":"ContainerDied","Data":"ea3ee0427921ab302a9830fe327e17855ab0fec493db468f66488876aa20cb24"} Mar 08 03:26:15.718925 master-0 kubenswrapper[13046]: I0308 03:26:15.718883 13046 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="ea3ee0427921ab302a9830fe327e17855ab0fec493db468f66488876aa20cb24" Mar 08 03:26:15.719553 master-0 kubenswrapper[13046]: I0308 03:26:15.718971 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 08 03:26:15.720547 master-0 kubenswrapper[13046]: I0308 03:26:15.720426 13046 generic.go:334] "Generic (PLEG): container finished" podID="1f559362-f339-4de3-9666-757654e9c35e" containerID="baf31e0018548ce23470fd372aed90f69f405884c0594cc4cd64bd69dc859f86" exitCode=0 Mar 08 03:26:15.720547 master-0 kubenswrapper[13046]: I0308 03:26:15.720518 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" event={"ID":"1f559362-f339-4de3-9666-757654e9c35e","Type":"ContainerDied","Data":"baf31e0018548ce23470fd372aed90f69f405884c0594cc4cd64bd69dc859f86"} Mar 08 03:26:15.722016 master-0 kubenswrapper[13046]: I0308 03:26:15.721965 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_b63b5185e5bc481b891676a634cb5625/kube-controller-manager-cert-syncer/0.log" Mar 08 03:26:15.722686 master-0 kubenswrapper[13046]: I0308 03:26:15.722642 13046 generic.go:334] "Generic (PLEG): container finished" podID="b63b5185e5bc481b891676a634cb5625" containerID="1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" exitCode=0 Mar 08 03:26:15.722686 master-0 kubenswrapper[13046]: I0308 03:26:15.722665 13046 generic.go:334] "Generic (PLEG): container finished" podID="b63b5185e5bc481b891676a634cb5625" containerID="cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" exitCode=2 Mar 08 03:26:15.722686 master-0 kubenswrapper[13046]: I0308 03:26:15.722675 13046 generic.go:334] "Generic (PLEG): container finished" podID="b63b5185e5bc481b891676a634cb5625" 
containerID="376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" exitCode=0 Mar 08 03:26:15.722686 master-0 kubenswrapper[13046]: I0308 03:26:15.722685 13046 generic.go:334] "Generic (PLEG): container finished" podID="b63b5185e5bc481b891676a634cb5625" containerID="90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" exitCode=0 Mar 08 03:26:15.723102 master-0 kubenswrapper[13046]: I0308 03:26:15.722747 13046 scope.go:117] "RemoveContainer" containerID="1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" Mar 08 03:26:15.723102 master-0 kubenswrapper[13046]: I0308 03:26:15.722891 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:15.727353 master-0 kubenswrapper[13046]: I0308 03:26:15.727184 13046 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="f84796a8b5989abd5932fcb3d7b200bb0e341030e182203e1c0af65c8d3cb371" exitCode=0 Mar 08 03:26:15.727353 master-0 kubenswrapper[13046]: I0308 03:26:15.727265 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"f84796a8b5989abd5932fcb3d7b200bb0e341030e182203e1c0af65c8d3cb371"} Mar 08 03:26:15.727353 master-0 kubenswrapper[13046]: I0308 03:26:15.727353 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"5f8dcd4a3eb0d6affcdc907a67ea64046ea0ef8a401fa56d05c919396d7c2e99"} Mar 08 03:26:15.728576 master-0 kubenswrapper[13046]: I0308 03:26:15.728514 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
oldPodUID="b63b5185e5bc481b891676a634cb5625" podUID="7c77ceff52dd4a01c709016eef561173" Mar 08 03:26:15.729669 master-0 kubenswrapper[13046]: I0308 03:26:15.729628 13046 generic.go:334] "Generic (PLEG): container finished" podID="6c539a17-b57b-446a-b50d-976adc8766ef" containerID="b615173188d3c6ef1796ab2b05330b5d0d8e92137820ea937309af44c50f2c0e" exitCode=0 Mar 08 03:26:15.729777 master-0 kubenswrapper[13046]: I0308 03:26:15.729665 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"6c539a17-b57b-446a-b50d-976adc8766ef","Type":"ContainerDied","Data":"b615173188d3c6ef1796ab2b05330b5d0d8e92137820ea937309af44c50f2c0e"} Mar 08 03:26:15.735230 master-0 kubenswrapper[13046]: W0308 03:26:15.735150 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8517691b_937c_4cde_a7d2_fe18d6b7193d.slice/crio-37fb8f6f5abcb97cff6bcd1ccf5844d7dd93bae9a05962636f1087066edf6fed WatchSource:0}: Error finding container 37fb8f6f5abcb97cff6bcd1ccf5844d7dd93bae9a05962636f1087066edf6fed: Status 404 returned error can't find the container with id 37fb8f6f5abcb97cff6bcd1ccf5844d7dd93bae9a05962636f1087066edf6fed Mar 08 03:26:15.745771 master-0 kubenswrapper[13046]: I0308 03:26:15.745541 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:26:15.749139 master-0 kubenswrapper[13046]: I0308 03:26:15.747707 13046 scope.go:117] "RemoveContainer" containerID="cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" Mar 08 03:26:15.847503 master-0 kubenswrapper[13046]: I0308 03:26:15.847352 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-session\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.847822 master-0 kubenswrapper[13046]: I0308 03:26:15.847789 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-service-ca\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.848089 master-0 kubenswrapper[13046]: I0308 03:26:15.848063 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-login\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.848271 master-0 kubenswrapper[13046]: I0308 03:26:15.848246 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-trusted-ca-bundle\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.848441 master-0 kubenswrapper[13046]: I0308 03:26:15.848418 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-router-certs\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.848647 master-0 kubenswrapper[13046]: I0308 03:26:15.848622 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74hmp\" (UniqueName: \"kubernetes.io/projected/1f559362-f339-4de3-9666-757654e9c35e-kube-api-access-74hmp\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.848807 master-0 kubenswrapper[13046]: I0308 03:26:15.848783 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f559362-f339-4de3-9666-757654e9c35e-audit-dir\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.848962 master-0 kubenswrapper[13046]: I0308 03:26:15.848939 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-cliconfig\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.849085 master-0 kubenswrapper[13046]: I0308 03:26:15.849068 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-error\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.849206 master-0 kubenswrapper[13046]: I0308 03:26:15.849190 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-serving-cert\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.849381 master-0 kubenswrapper[13046]: I0308 03:26:15.849356 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-provider-selection\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.849548 master-0 kubenswrapper[13046]: I0308 03:26:15.849524 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-ocp-branding-template\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.849712 master-0 kubenswrapper[13046]: I0308 03:26:15.849696 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-audit-policies\") pod \"1f559362-f339-4de3-9666-757654e9c35e\" (UID: \"1f559362-f339-4de3-9666-757654e9c35e\") " Mar 08 03:26:15.850643 master-0 kubenswrapper[13046]: I0308 03:26:15.850568 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f559362-f339-4de3-9666-757654e9c35e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:26:15.851311 master-0 kubenswrapper[13046]: I0308 03:26:15.851215 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:26:15.852159 master-0 kubenswrapper[13046]: I0308 03:26:15.852114 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:26:15.852257 master-0 kubenswrapper[13046]: I0308 03:26:15.852206 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:26:15.857240 master-0 kubenswrapper[13046]: I0308 03:26:15.857193 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:26:15.865130 master-0 kubenswrapper[13046]: I0308 03:26:15.865051 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f559362-f339-4de3-9666-757654e9c35e-kube-api-access-74hmp" (OuterVolumeSpecName: "kube-api-access-74hmp") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "kube-api-access-74hmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:26:15.866617 master-0 kubenswrapper[13046]: I0308 03:26:15.866397 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:26:15.866617 master-0 kubenswrapper[13046]: I0308 03:26:15.866454 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:26:15.868470 master-0 kubenswrapper[13046]: I0308 03:26:15.868429 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:26:15.870271 master-0 kubenswrapper[13046]: I0308 03:26:15.870205 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:26:15.876897 master-0 kubenswrapper[13046]: I0308 03:26:15.876849 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:26:15.877650 master-0 kubenswrapper[13046]: I0308 03:26:15.876781 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:26:15.878419 master-0 kubenswrapper[13046]: I0308 03:26:15.878355 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1f559362-f339-4de3-9666-757654e9c35e" (UID: "1f559362-f339-4de3-9666-757654e9c35e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:26:15.928146 master-0 kubenswrapper[13046]: I0308 03:26:15.928084 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="b63b5185e5bc481b891676a634cb5625" podUID="7c77ceff52dd4a01c709016eef561173" Mar 08 03:26:15.936969 master-0 kubenswrapper[13046]: I0308 03:26:15.936923 13046 scope.go:117] "RemoveContainer" containerID="376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" Mar 08 03:26:15.952009 master-0 kubenswrapper[13046]: I0308 03:26:15.951960 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952009 master-0 kubenswrapper[13046]: I0308 03:26:15.952015 13046 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952009 master-0 kubenswrapper[13046]: I0308 03:26:15.952031 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-session\") on node \"master-0\" DevicePath 
\"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952046 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952061 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952074 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952090 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952103 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74hmp\" (UniqueName: \"kubernetes.io/projected/1f559362-f339-4de3-9666-757654e9c35e-kube-api-access-74hmp\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952115 13046 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f559362-f339-4de3-9666-757654e9c35e-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952128 13046 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952143 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952157 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.952193 master-0 kubenswrapper[13046]: I0308 03:26:15.952171 13046 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f559362-f339-4de3-9666-757654e9c35e-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:15.971941 master-0 kubenswrapper[13046]: I0308 03:26:15.971897 13046 scope.go:117] "RemoveContainer" containerID="90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" Mar 08 03:26:16.016365 master-0 kubenswrapper[13046]: I0308 03:26:16.016298 13046 scope.go:117] "RemoveContainer" containerID="1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" Mar 08 03:26:16.016920 master-0 kubenswrapper[13046]: E0308 03:26:16.016861 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": container with ID starting with 1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5 not found: ID does not exist" 
containerID="1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" Mar 08 03:26:16.016971 master-0 kubenswrapper[13046]: I0308 03:26:16.016935 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5"} err="failed to get container status \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": rpc error: code = NotFound desc = could not find container \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": container with ID starting with 1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5 not found: ID does not exist" Mar 08 03:26:16.017014 master-0 kubenswrapper[13046]: I0308 03:26:16.016976 13046 scope.go:117] "RemoveContainer" containerID="cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" Mar 08 03:26:16.017301 master-0 kubenswrapper[13046]: E0308 03:26:16.017261 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": container with ID starting with cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a not found: ID does not exist" containerID="cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" Mar 08 03:26:16.017340 master-0 kubenswrapper[13046]: I0308 03:26:16.017302 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a"} err="failed to get container status \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": rpc error: code = NotFound desc = could not find container \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": container with ID starting with cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a not found: ID does not exist" Mar 08 03:26:16.017340 master-0 
kubenswrapper[13046]: I0308 03:26:16.017329 13046 scope.go:117] "RemoveContainer" containerID="376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" Mar 08 03:26:16.017722 master-0 kubenswrapper[13046]: E0308 03:26:16.017685 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": container with ID starting with 376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c not found: ID does not exist" containerID="376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" Mar 08 03:26:16.017765 master-0 kubenswrapper[13046]: I0308 03:26:16.017729 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c"} err="failed to get container status \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": rpc error: code = NotFound desc = could not find container \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": container with ID starting with 376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c not found: ID does not exist" Mar 08 03:26:16.017765 master-0 kubenswrapper[13046]: I0308 03:26:16.017754 13046 scope.go:117] "RemoveContainer" containerID="90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" Mar 08 03:26:16.018214 master-0 kubenswrapper[13046]: E0308 03:26:16.018174 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": container with ID starting with 90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9 not found: ID does not exist" containerID="90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" Mar 08 03:26:16.018263 master-0 kubenswrapper[13046]: I0308 03:26:16.018217 
13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9"} err="failed to get container status \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": rpc error: code = NotFound desc = could not find container \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": container with ID starting with 90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9 not found: ID does not exist" Mar 08 03:26:16.018263 master-0 kubenswrapper[13046]: I0308 03:26:16.018243 13046 scope.go:117] "RemoveContainer" containerID="1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" Mar 08 03:26:16.018778 master-0 kubenswrapper[13046]: I0308 03:26:16.018715 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5"} err="failed to get container status \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": rpc error: code = NotFound desc = could not find container \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": container with ID starting with 1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5 not found: ID does not exist" Mar 08 03:26:16.018831 master-0 kubenswrapper[13046]: I0308 03:26:16.018782 13046 scope.go:117] "RemoveContainer" containerID="cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" Mar 08 03:26:16.019070 master-0 kubenswrapper[13046]: I0308 03:26:16.019040 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a"} err="failed to get container status \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": rpc error: code = NotFound desc = could not find container 
\"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": container with ID starting with cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a not found: ID does not exist" Mar 08 03:26:16.019114 master-0 kubenswrapper[13046]: I0308 03:26:16.019071 13046 scope.go:117] "RemoveContainer" containerID="376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" Mar 08 03:26:16.019360 master-0 kubenswrapper[13046]: I0308 03:26:16.019318 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c"} err="failed to get container status \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": rpc error: code = NotFound desc = could not find container \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": container with ID starting with 376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c not found: ID does not exist" Mar 08 03:26:16.019399 master-0 kubenswrapper[13046]: I0308 03:26:16.019356 13046 scope.go:117] "RemoveContainer" containerID="90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" Mar 08 03:26:16.019671 master-0 kubenswrapper[13046]: I0308 03:26:16.019630 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9"} err="failed to get container status \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": rpc error: code = NotFound desc = could not find container \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": container with ID starting with 90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9 not found: ID does not exist" Mar 08 03:26:16.019763 master-0 kubenswrapper[13046]: I0308 03:26:16.019666 13046 scope.go:117] "RemoveContainer" containerID="1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" Mar 08 
03:26:16.019975 master-0 kubenswrapper[13046]: I0308 03:26:16.019938 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5"} err="failed to get container status \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": rpc error: code = NotFound desc = could not find container \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": container with ID starting with 1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5 not found: ID does not exist" Mar 08 03:26:16.020014 master-0 kubenswrapper[13046]: I0308 03:26:16.019972 13046 scope.go:117] "RemoveContainer" containerID="cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" Mar 08 03:26:16.020237 master-0 kubenswrapper[13046]: I0308 03:26:16.020198 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a"} err="failed to get container status \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": rpc error: code = NotFound desc = could not find container \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": container with ID starting with cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a not found: ID does not exist" Mar 08 03:26:16.020272 master-0 kubenswrapper[13046]: I0308 03:26:16.020234 13046 scope.go:117] "RemoveContainer" containerID="376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" Mar 08 03:26:16.020474 master-0 kubenswrapper[13046]: I0308 03:26:16.020445 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c"} err="failed to get container status \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": rpc error: code = NotFound desc = could not find 
container \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": container with ID starting with 376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c not found: ID does not exist" Mar 08 03:26:16.020548 master-0 kubenswrapper[13046]: I0308 03:26:16.020476 13046 scope.go:117] "RemoveContainer" containerID="90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" Mar 08 03:26:16.020745 master-0 kubenswrapper[13046]: I0308 03:26:16.020710 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9"} err="failed to get container status \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": rpc error: code = NotFound desc = could not find container \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": container with ID starting with 90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9 not found: ID does not exist" Mar 08 03:26:16.020745 master-0 kubenswrapper[13046]: I0308 03:26:16.020741 13046 scope.go:117] "RemoveContainer" containerID="1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5" Mar 08 03:26:16.020986 master-0 kubenswrapper[13046]: I0308 03:26:16.020944 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5"} err="failed to get container status \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": rpc error: code = NotFound desc = could not find container \"1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5\": container with ID starting with 1ddb3fbce7acb9c019d4af0088ca573f926fe094927677810f56db524e1950d5 not found: ID does not exist" Mar 08 03:26:16.020986 master-0 kubenswrapper[13046]: I0308 03:26:16.020980 13046 scope.go:117] "RemoveContainer" containerID="cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a" 
Mar 08 03:26:16.021217 master-0 kubenswrapper[13046]: I0308 03:26:16.021187 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a"} err="failed to get container status \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": rpc error: code = NotFound desc = could not find container \"cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a\": container with ID starting with cc9fa88410d9a800c2f66463cfa26a78e37140e161aff8314f97eae84216bc9a not found: ID does not exist" Mar 08 03:26:16.021262 master-0 kubenswrapper[13046]: I0308 03:26:16.021218 13046 scope.go:117] "RemoveContainer" containerID="376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c" Mar 08 03:26:16.021451 master-0 kubenswrapper[13046]: I0308 03:26:16.021423 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c"} err="failed to get container status \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": rpc error: code = NotFound desc = could not find container \"376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c\": container with ID starting with 376bc30168376b379f5440d3ce175b6f3bb2de0de1814953cdff3c65ff67963c not found: ID does not exist" Mar 08 03:26:16.021509 master-0 kubenswrapper[13046]: I0308 03:26:16.021453 13046 scope.go:117] "RemoveContainer" containerID="90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9" Mar 08 03:26:16.021708 master-0 kubenswrapper[13046]: I0308 03:26:16.021678 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9"} err="failed to get container status \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": rpc error: code = NotFound desc = could not find 
container \"90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9\": container with ID starting with 90ab9ed8268707143dfc682bfd3880caae069fa8b4b41379da866733097e76b9 not found: ID does not exist" Mar 08 03:26:16.126866 master-0 kubenswrapper[13046]: I0308 03:26:16.126752 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63b5185e5bc481b891676a634cb5625" path="/var/lib/kubelet/pods/b63b5185e5bc481b891676a634cb5625/volumes" Mar 08 03:26:16.736954 master-0 kubenswrapper[13046]: I0308 03:26:16.736825 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"1ed6b389e529262c2ccce34e6fd7d1cd11742c4ee42c4c9103ff643781b40d7e"} Mar 08 03:26:16.737190 master-0 kubenswrapper[13046]: I0308 03:26:16.737164 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 08 03:26:16.737410 master-0 kubenswrapper[13046]: I0308 03:26:16.737185 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"6d953e4f91b8cabb5a46494d1e8182e5cb0d3265136076c00536d15789e7a26e"} Mar 08 03:26:16.737410 master-0 kubenswrapper[13046]: I0308 03:26:16.737406 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"c073a476594c9f68c9b2d808be060720455e04f7a338388678f6887419a658b2"} Mar 08 03:26:16.738076 master-0 kubenswrapper[13046]: I0308 03:26:16.738044 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" 
event={"ID":"1f559362-f339-4de3-9666-757654e9c35e","Type":"ContainerDied","Data":"f344d49c740f287f640ec7329405534c81d8db828faa4614d580010632defbdd"} Mar 08 03:26:16.738153 master-0 kubenswrapper[13046]: I0308 03:26:16.738078 13046 scope.go:117] "RemoveContainer" containerID="baf31e0018548ce23470fd372aed90f69f405884c0594cc4cd64bd69dc859f86" Mar 08 03:26:16.738232 master-0 kubenswrapper[13046]: I0308 03:26:16.738208 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff964fcf-s5mbf" Mar 08 03:26:16.742836 master-0 kubenswrapper[13046]: I0308 03:26:16.742780 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"8517691b-937c-4cde-a7d2-fe18d6b7193d","Type":"ContainerStarted","Data":"3899d101e408c7a23497b90981b3128c53aac06bc31d8fc4ebd6216130e8cdcd"} Mar 08 03:26:16.742836 master-0 kubenswrapper[13046]: I0308 03:26:16.742834 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"8517691b-937c-4cde-a7d2-fe18d6b7193d","Type":"ContainerStarted","Data":"37fb8f6f5abcb97cff6bcd1ccf5844d7dd93bae9a05962636f1087066edf6fed"} Mar 08 03:26:16.748902 master-0 kubenswrapper[13046]: I0308 03:26:16.748835 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-h4zwb" event={"ID":"9eb92440-4e70-4fa6-9315-444d6f99e287","Type":"ContainerStarted","Data":"13d5141e001c2f9b6d59f8162655864476b35f24f432d89c26c0624ecf0e8d1d"} Mar 08 03:26:16.749063 master-0 kubenswrapper[13046]: I0308 03:26:16.749042 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-84f57b9877-h4zwb" Mar 08 03:26:16.751805 master-0 kubenswrapper[13046]: I0308 03:26:16.751750 13046 patch_prober.go:28] interesting pod/downloads-84f57b9877-h4zwb container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.128.0.86:8080/\": dial tcp 10.128.0.86:8080: connect: connection refused" start-of-body= Mar 08 03:26:16.751912 master-0 kubenswrapper[13046]: I0308 03:26:16.751823 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-h4zwb" podUID="9eb92440-4e70-4fa6-9315-444d6f99e287" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.86:8080/\": dial tcp 10.128.0.86:8080: connect: connection refused" Mar 08 03:26:16.767569 master-0 kubenswrapper[13046]: I0308 03:26:16.766680 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=4.76666296 podStartE2EDuration="4.76666296s" podCreationTimestamp="2026-03-08 03:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:26:16.764215422 +0000 UTC m=+778.842982639" watchObservedRunningTime="2026-03-08 03:26:16.76666296 +0000 UTC m=+778.845430187" Mar 08 03:26:16.789980 master-0 kubenswrapper[13046]: I0308 03:26:16.789930 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5fff964fcf-s5mbf"] Mar 08 03:26:16.799727 master-0 kubenswrapper[13046]: I0308 03:26:16.799661 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5fff964fcf-s5mbf"] Mar 08 03:26:16.802537 master-0 kubenswrapper[13046]: I0308 03:26:16.802494 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-d69ccb978-jj8tj"] Mar 08 03:26:16.803257 master-0 kubenswrapper[13046]: E0308 03:26:16.802770 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f559362-f339-4de3-9666-757654e9c35e" containerName="oauth-openshift" Mar 08 03:26:16.803257 master-0 kubenswrapper[13046]: I0308 03:26:16.802790 13046 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1f559362-f339-4de3-9666-757654e9c35e" containerName="oauth-openshift" Mar 08 03:26:16.803257 master-0 kubenswrapper[13046]: E0308 03:26:16.802830 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d88f12-2fa2-4f01-badf-3543770a14f1" containerName="installer" Mar 08 03:26:16.803257 master-0 kubenswrapper[13046]: I0308 03:26:16.802839 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d88f12-2fa2-4f01-badf-3543770a14f1" containerName="installer" Mar 08 03:26:16.803257 master-0 kubenswrapper[13046]: I0308 03:26:16.803205 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f559362-f339-4de3-9666-757654e9c35e" containerName="oauth-openshift" Mar 08 03:26:16.803257 master-0 kubenswrapper[13046]: I0308 03:26:16.803247 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d88f12-2fa2-4f01-badf-3543770a14f1" containerName="installer" Mar 08 03:26:16.811228 master-0 kubenswrapper[13046]: I0308 03:26:16.811160 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-84f57b9877-h4zwb" podStartSLOduration=1.739705214 podStartE2EDuration="51.811140293s" podCreationTimestamp="2026-03-08 03:25:25 +0000 UTC" firstStartedPulling="2026-03-08 03:25:25.829606024 +0000 UTC m=+727.908373241" lastFinishedPulling="2026-03-08 03:26:15.901041103 +0000 UTC m=+777.979808320" observedRunningTime="2026-03-08 03:26:16.806223667 +0000 UTC m=+778.884990884" watchObservedRunningTime="2026-03-08 03:26:16.811140293 +0000 UTC m=+778.889907510" Mar 08 03:26:16.819302 master-0 kubenswrapper[13046]: I0308 03:26:16.819262 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.823923 master-0 kubenswrapper[13046]: I0308 03:26:16.823621 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d69ccb978-jj8tj"]
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825172 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825587 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825664 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825674 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825725 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825788 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825828 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825842 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-9f7bj"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825923 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825938 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.825963 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 08 03:26:16.826109 master-0 kubenswrapper[13046]: I0308 03:26:16.826027 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 08 03:26:16.836002 master-0 kubenswrapper[13046]: I0308 03:26:16.833399 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 08 03:26:16.842088 master-0 kubenswrapper[13046]: I0308 03:26:16.842016 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=10.841996568999999 podStartE2EDuration="10.841996569s" podCreationTimestamp="2026-03-08 03:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:26:16.833851213 +0000 UTC m=+778.912618440" watchObservedRunningTime="2026-03-08 03:26:16.841996569 +0000 UTC m=+778.920763796"
Mar 08 03:26:16.844588 master-0 kubenswrapper[13046]: I0308 03:26:16.844561 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 08 03:26:16.969793 master-0 kubenswrapper[13046]: I0308 03:26:16.969709 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-audit-policies\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970300 master-0 kubenswrapper[13046]: I0308 03:26:16.970008 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970300 master-0 kubenswrapper[13046]: I0308 03:26:16.970188 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgrm2\" (UniqueName: \"kubernetes.io/projected/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-kube-api-access-rgrm2\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970520 master-0 kubenswrapper[13046]: I0308 03:26:16.970302 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-session\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970520 master-0 kubenswrapper[13046]: I0308 03:26:16.970386 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970610 master-0 kubenswrapper[13046]: I0308 03:26:16.970477 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-router-certs\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970659 master-0 kubenswrapper[13046]: I0308 03:26:16.970573 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-error\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970703 master-0 kubenswrapper[13046]: I0308 03:26:16.970657 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970875 master-0 kubenswrapper[13046]: I0308 03:26:16.970840 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-service-ca\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970940 master-0 kubenswrapper[13046]: I0308 03:26:16.970899 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.970986 master-0 kubenswrapper[13046]: I0308 03:26:16.970959 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-audit-dir\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.971087 master-0 kubenswrapper[13046]: I0308 03:26:16.971058 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-login\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:16.971140 master-0 kubenswrapper[13046]: I0308 03:26:16.971099 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.073004 master-0 kubenswrapper[13046]: I0308 03:26:17.072894 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-login\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.073250 master-0 kubenswrapper[13046]: I0308 03:26:17.073230 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.073389 master-0 kubenswrapper[13046]: I0308 03:26:17.073372 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-audit-policies\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.073617 master-0 kubenswrapper[13046]: I0308 03:26:17.073597 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.073795 master-0 kubenswrapper[13046]: I0308 03:26:17.073776 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgrm2\" (UniqueName: \"kubernetes.io/projected/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-kube-api-access-rgrm2\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.074281 master-0 kubenswrapper[13046]: I0308 03:26:17.074260 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-session\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.075581 master-0 kubenswrapper[13046]: I0308 03:26:17.074819 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.075813 master-0 kubenswrapper[13046]: I0308 03:26:17.075792 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-router-certs\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.075936 master-0 kubenswrapper[13046]: I0308 03:26:17.075904 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-error\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.076068 master-0 kubenswrapper[13046]: I0308 03:26:17.076049 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.076401 master-0 kubenswrapper[13046]: I0308 03:26:17.076381 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-service-ca\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.076560 master-0 kubenswrapper[13046]: I0308 03:26:17.076542 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.076684 master-0 kubenswrapper[13046]: I0308 03:26:17.076667 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-audit-dir\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.080104 master-0 kubenswrapper[13046]: I0308 03:26:17.078062 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-session\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.080104 master-0 kubenswrapper[13046]: I0308 03:26:17.074864 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-audit-policies\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.080328 master-0 kubenswrapper[13046]: I0308 03:26:17.075039 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.080328 master-0 kubenswrapper[13046]: I0308 03:26:17.079333 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.080328 master-0 kubenswrapper[13046]: I0308 03:26:17.079402 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.080328 master-0 kubenswrapper[13046]: I0308 03:26:17.079444 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-audit-dir\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.080687 master-0 kubenswrapper[13046]: I0308 03:26:17.080542 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-service-ca\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.081041 master-0 kubenswrapper[13046]: I0308 03:26:17.081011 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-login\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.084320 master-0 kubenswrapper[13046]: I0308 03:26:17.083857 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-error\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.084320 master-0 kubenswrapper[13046]: I0308 03:26:17.084116 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-router-certs\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.084320 master-0 kubenswrapper[13046]: I0308 03:26:17.084283 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.084649 master-0 kubenswrapper[13046]: I0308 03:26:17.084607 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.090464 master-0 kubenswrapper[13046]: I0308 03:26:17.090416 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgrm2\" (UniqueName: \"kubernetes.io/projected/8b8268e3-34e6-4672-b6ab-d9f93dd788d7-kube-api-access-rgrm2\") pod \"oauth-openshift-d69ccb978-jj8tj\" (UID: \"8b8268e3-34e6-4672-b6ab-d9f93dd788d7\") " pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.151940 master-0 kubenswrapper[13046]: I0308 03:26:17.151568 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:26:17.157177 master-0 kubenswrapper[13046]: I0308 03:26:17.157100 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:17.287304 master-0 kubenswrapper[13046]: I0308 03:26:17.287226 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-kubelet-dir\") pod \"6c539a17-b57b-446a-b50d-976adc8766ef\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") "
Mar 08 03:26:17.287477 master-0 kubenswrapper[13046]: I0308 03:26:17.287346 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6c539a17-b57b-446a-b50d-976adc8766ef" (UID: "6c539a17-b57b-446a-b50d-976adc8766ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:26:17.287477 master-0 kubenswrapper[13046]: I0308 03:26:17.287370 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c539a17-b57b-446a-b50d-976adc8766ef-kube-api-access\") pod \"6c539a17-b57b-446a-b50d-976adc8766ef\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") "
Mar 08 03:26:17.287762 master-0 kubenswrapper[13046]: I0308 03:26:17.287547 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-var-lock\") pod \"6c539a17-b57b-446a-b50d-976adc8766ef\" (UID: \"6c539a17-b57b-446a-b50d-976adc8766ef\") "
Mar 08 03:26:17.287981 master-0 kubenswrapper[13046]: I0308 03:26:17.287858 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-var-lock" (OuterVolumeSpecName: "var-lock") pod "6c539a17-b57b-446a-b50d-976adc8766ef" (UID: "6c539a17-b57b-446a-b50d-976adc8766ef"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:26:17.288448 master-0 kubenswrapper[13046]: I0308 03:26:17.288388 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:17.288448 master-0 kubenswrapper[13046]: I0308 03:26:17.288422 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c539a17-b57b-446a-b50d-976adc8766ef-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:17.294696 master-0 kubenswrapper[13046]: I0308 03:26:17.294646 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c539a17-b57b-446a-b50d-976adc8766ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6c539a17-b57b-446a-b50d-976adc8766ef" (UID: "6c539a17-b57b-446a-b50d-976adc8766ef"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:26:17.391136 master-0 kubenswrapper[13046]: I0308 03:26:17.390383 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c539a17-b57b-446a-b50d-976adc8766ef-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:17.643510 master-0 kubenswrapper[13046]: I0308 03:26:17.643415 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-d69ccb978-jj8tj"]
Mar 08 03:26:17.644303 master-0 kubenswrapper[13046]: W0308 03:26:17.644267 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8268e3_34e6_4672_b6ab_d9f93dd788d7.slice/crio-0a100efd0b0b8bb3a8f6556256bd3daed97d791fb1e82c6e90d31cb5bfb36b65 WatchSource:0}: Error finding container 0a100efd0b0b8bb3a8f6556256bd3daed97d791fb1e82c6e90d31cb5bfb36b65: Status 404 returned error can't find the container with id 0a100efd0b0b8bb3a8f6556256bd3daed97d791fb1e82c6e90d31cb5bfb36b65
Mar 08 03:26:17.762244 master-0 kubenswrapper[13046]: I0308 03:26:17.762189 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"6c539a17-b57b-446a-b50d-976adc8766ef","Type":"ContainerDied","Data":"aab46491c5b58c93bbde4dc1b1fbf43386a5ee6a7bfde311b9d73e2882ce63cf"}
Mar 08 03:26:17.762452 master-0 kubenswrapper[13046]: I0308 03:26:17.762253 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aab46491c5b58c93bbde4dc1b1fbf43386a5ee6a7bfde311b9d73e2882ce63cf"
Mar 08 03:26:17.762452 master-0 kubenswrapper[13046]: I0308 03:26:17.762340 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 08 03:26:17.768800 master-0 kubenswrapper[13046]: I0308 03:26:17.768730 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj" event={"ID":"8b8268e3-34e6-4672-b6ab-d9f93dd788d7","Type":"ContainerStarted","Data":"0a100efd0b0b8bb3a8f6556256bd3daed97d791fb1e82c6e90d31cb5bfb36b65"}
Mar 08 03:26:17.769618 master-0 kubenswrapper[13046]: I0308 03:26:17.769569 13046 patch_prober.go:28] interesting pod/downloads-84f57b9877-h4zwb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.86:8080/\": dial tcp 10.128.0.86:8080: connect: connection refused" start-of-body=
Mar 08 03:26:17.769709 master-0 kubenswrapper[13046]: I0308 03:26:17.769643 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-h4zwb" podUID="9eb92440-4e70-4fa6-9315-444d6f99e287" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.86:8080/\": dial tcp 10.128.0.86:8080: connect: connection refused"
Mar 08 03:26:18.169145 master-0 kubenswrapper[13046]: I0308 03:26:18.168992 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f559362-f339-4de3-9666-757654e9c35e" path="/var/lib/kubelet/pods/1f559362-f339-4de3-9666-757654e9c35e/volumes"
Mar 08 03:26:18.351063 master-0 kubenswrapper[13046]: I0308 03:26:18.350999 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-dc7d49677-gsx8f" podUID="158946fc-eae1-4823-a93c-398d4aede495" containerName="console" containerID="cri-o://b4d6c4c4750abb273febb1402324cd536599af35d258e2f54378387537d6e101" gracePeriod=15
Mar 08 03:26:18.493109 master-0 kubenswrapper[13046]: I0308 03:26:18.492980 13046 scope.go:117] "RemoveContainer" containerID="594646acaa0ff62f90dedd27d6db99772884979a7ba85cf829b1a0afe318a6a7"
Mar 08 03:26:18.778064 master-0 kubenswrapper[13046]: I0308 03:26:18.778005 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj" event={"ID":"8b8268e3-34e6-4672-b6ab-d9f93dd788d7","Type":"ContainerStarted","Data":"427ab8aba95c71fa910cc163f913ecf5e0c45a9bda22df322ee03f5274def350"}
Mar 08 03:26:18.779879 master-0 kubenswrapper[13046]: I0308 03:26:18.779802 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:18.781971 master-0 kubenswrapper[13046]: I0308 03:26:18.781942 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dc7d49677-gsx8f_158946fc-eae1-4823-a93c-398d4aede495/console/0.log"
Mar 08 03:26:18.782050 master-0 kubenswrapper[13046]: I0308 03:26:18.781974 13046 generic.go:334] "Generic (PLEG): container finished" podID="158946fc-eae1-4823-a93c-398d4aede495" containerID="b4d6c4c4750abb273febb1402324cd536599af35d258e2f54378387537d6e101" exitCode=2
Mar 08 03:26:18.782050 master-0 kubenswrapper[13046]: I0308 03:26:18.781997 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc7d49677-gsx8f" event={"ID":"158946fc-eae1-4823-a93c-398d4aede495","Type":"ContainerDied","Data":"b4d6c4c4750abb273febb1402324cd536599af35d258e2f54378387537d6e101"}
Mar 08 03:26:18.801714 master-0 kubenswrapper[13046]: I0308 03:26:18.801674 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj"
Mar 08 03:26:18.951457 master-0 kubenswrapper[13046]: I0308 03:26:18.951416 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dc7d49677-gsx8f_158946fc-eae1-4823-a93c-398d4aede495/console/0.log"
Mar 08 03:26:18.951650 master-0 kubenswrapper[13046]: I0308 03:26:18.951510 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dc7d49677-gsx8f"
Mar 08 03:26:18.955619 master-0 kubenswrapper[13046]: I0308 03:26:18.955558 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-d69ccb978-jj8tj" podStartSLOduration=30.955546833 podStartE2EDuration="30.955546833s" podCreationTimestamp="2026-03-08 03:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:26:18.951158471 +0000 UTC m=+781.029925728" watchObservedRunningTime="2026-03-08 03:26:18.955546833 +0000 UTC m=+781.034314050"
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.135284 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-service-ca\") pod \"158946fc-eae1-4823-a93c-398d4aede495\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") "
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.135384 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-oauth-serving-cert\") pod \"158946fc-eae1-4823-a93c-398d4aede495\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") "
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.135423 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-console-config\") pod \"158946fc-eae1-4823-a93c-398d4aede495\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") "
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.135536 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-oauth-config\") pod \"158946fc-eae1-4823-a93c-398d4aede495\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") "
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.135616 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-serving-cert\") pod \"158946fc-eae1-4823-a93c-398d4aede495\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") "
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.135732 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn4j6\" (UniqueName: \"kubernetes.io/projected/158946fc-eae1-4823-a93c-398d4aede495-kube-api-access-vn4j6\") pod \"158946fc-eae1-4823-a93c-398d4aede495\" (UID: \"158946fc-eae1-4823-a93c-398d4aede495\") "
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.136143 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-service-ca" (OuterVolumeSpecName: "service-ca") pod "158946fc-eae1-4823-a93c-398d4aede495" (UID: "158946fc-eae1-4823-a93c-398d4aede495"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:26:19.137395 master-0 kubenswrapper[13046]: I0308 03:26:19.137120 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-console-config" (OuterVolumeSpecName: "console-config") pod "158946fc-eae1-4823-a93c-398d4aede495" (UID: "158946fc-eae1-4823-a93c-398d4aede495"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:26:19.138671 master-0 kubenswrapper[13046]: I0308 03:26:19.138608 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "158946fc-eae1-4823-a93c-398d4aede495" (UID: "158946fc-eae1-4823-a93c-398d4aede495"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:26:19.140169 master-0 kubenswrapper[13046]: I0308 03:26:19.139952 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "158946fc-eae1-4823-a93c-398d4aede495" (UID: "158946fc-eae1-4823-a93c-398d4aede495"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:26:19.147592 master-0 kubenswrapper[13046]: I0308 03:26:19.140636 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "158946fc-eae1-4823-a93c-398d4aede495" (UID: "158946fc-eae1-4823-a93c-398d4aede495"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:26:19.147592 master-0 kubenswrapper[13046]: I0308 03:26:19.141041 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158946fc-eae1-4823-a93c-398d4aede495-kube-api-access-vn4j6" (OuterVolumeSpecName: "kube-api-access-vn4j6") pod "158946fc-eae1-4823-a93c-398d4aede495" (UID: "158946fc-eae1-4823-a93c-398d4aede495"). InnerVolumeSpecName "kube-api-access-vn4j6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:26:19.239672 master-0 kubenswrapper[13046]: I0308 03:26:19.239605 13046 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:19.239672 master-0 kubenswrapper[13046]: I0308 03:26:19.239660 13046 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/158946fc-eae1-4823-a93c-398d4aede495-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:19.239672 master-0 kubenswrapper[13046]: I0308 03:26:19.239681 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn4j6\" (UniqueName: \"kubernetes.io/projected/158946fc-eae1-4823-a93c-398d4aede495-kube-api-access-vn4j6\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:19.240324 master-0 kubenswrapper[13046]: I0308 03:26:19.239700 13046 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:19.240324 master-0 kubenswrapper[13046]: I0308 03:26:19.239718 13046 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:19.240324 master-0 kubenswrapper[13046]: I0308 03:26:19.239737 13046 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/158946fc-eae1-4823-a93c-398d4aede495-console-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:26:19.812140 master-0 kubenswrapper[13046]: I0308 03:26:19.812046 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dc7d49677-gsx8f_158946fc-eae1-4823-a93c-398d4aede495/console/0.log"
Mar 08 03:26:19.812462 master-0 kubenswrapper[13046]: I0308 03:26:19.812249 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dc7d49677-gsx8f" event={"ID":"158946fc-eae1-4823-a93c-398d4aede495","Type":"ContainerDied","Data":"19f7a1f66bac10972eeab9d6c4173ff7b45da8ecd66dc311e55906df50ba3772"}
Mar 08 03:26:19.812462 master-0 kubenswrapper[13046]: I0308 03:26:19.812341 13046 scope.go:117] "RemoveContainer" containerID="b4d6c4c4750abb273febb1402324cd536599af35d258e2f54378387537d6e101"
Mar 08 03:26:19.818552 master-0 kubenswrapper[13046]: I0308 03:26:19.812314 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dc7d49677-gsx8f"
Mar 08 03:26:20.053186 master-0 kubenswrapper[13046]: I0308 03:26:20.053091 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dc7d49677-gsx8f"]
Mar 08 03:26:20.310943 master-0 kubenswrapper[13046]: I0308 03:26:20.310860 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dc7d49677-gsx8f"]
Mar 08 03:26:22.118772 master-0 kubenswrapper[13046]: I0308 03:26:22.118689 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:22.142672 master-0 kubenswrapper[13046]: I0308 03:26:22.142594 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158946fc-eae1-4823-a93c-398d4aede495" path="/var/lib/kubelet/pods/158946fc-eae1-4823-a93c-398d4aede495/volumes" Mar 08 03:26:22.147220 master-0 kubenswrapper[13046]: I0308 03:26:22.147152 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ea1c015c-88d4-4228-9936-2c2313aaf81f" Mar 08 03:26:22.147220 master-0 kubenswrapper[13046]: I0308 03:26:22.147210 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ea1c015c-88d4-4228-9936-2c2313aaf81f" Mar 08 03:26:22.173345 master-0 kubenswrapper[13046]: I0308 03:26:22.173258 13046 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:22.178801 master-0 kubenswrapper[13046]: I0308 03:26:22.178732 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:26:22.191771 master-0 kubenswrapper[13046]: I0308 03:26:22.191449 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:22.194455 master-0 kubenswrapper[13046]: I0308 03:26:22.193673 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:26:22.201338 master-0 kubenswrapper[13046]: I0308 03:26:22.201283 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 08 03:26:22.227394 master-0 kubenswrapper[13046]: W0308 03:26:22.227277 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c77ceff52dd4a01c709016eef561173.slice/crio-ffcab8e8a9224e56398127d460b407ca28301c839e49ca3a620a817affea775a WatchSource:0}: Error finding container ffcab8e8a9224e56398127d460b407ca28301c839e49ca3a620a817affea775a: Status 404 returned error can't find the container with id ffcab8e8a9224e56398127d460b407ca28301c839e49ca3a620a817affea775a Mar 08 03:26:22.562703 master-0 kubenswrapper[13046]: I0308 03:26:22.562215 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cfa0b8bd-fa79-4b31-a43d-ccb2709e653b/installer/0.log" Mar 08 03:26:22.562703 master-0 kubenswrapper[13046]: I0308 03:26:22.562307 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:26:22.700764 master-0 kubenswrapper[13046]: I0308 03:26:22.700703 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kube-api-access\") pod \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " Mar 08 03:26:22.700959 master-0 kubenswrapper[13046]: I0308 03:26:22.700837 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-var-lock\") pod \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " Mar 08 03:26:22.700959 master-0 kubenswrapper[13046]: I0308 03:26:22.700896 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kubelet-dir\") pod \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\" (UID: \"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b\") " Mar 08 03:26:22.701078 master-0 kubenswrapper[13046]: I0308 03:26:22.701011 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-var-lock" (OuterVolumeSpecName: "var-lock") pod "cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" (UID: "cfa0b8bd-fa79-4b31-a43d-ccb2709e653b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:26:22.701078 master-0 kubenswrapper[13046]: I0308 03:26:22.701063 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" (UID: "cfa0b8bd-fa79-4b31-a43d-ccb2709e653b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:26:22.701460 master-0 kubenswrapper[13046]: I0308 03:26:22.701426 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:22.701460 master-0 kubenswrapper[13046]: I0308 03:26:22.701452 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:22.703820 master-0 kubenswrapper[13046]: I0308 03:26:22.703746 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" (UID: "cfa0b8bd-fa79-4b31-a43d-ccb2709e653b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:26:22.803748 master-0 kubenswrapper[13046]: I0308 03:26:22.803687 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 08 03:26:22.846866 master-0 kubenswrapper[13046]: I0308 03:26:22.846815 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cfa0b8bd-fa79-4b31-a43d-ccb2709e653b/installer/0.log" Mar 08 03:26:22.847072 master-0 kubenswrapper[13046]: I0308 03:26:22.846901 13046 generic.go:334] "Generic (PLEG): container finished" podID="cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" containerID="61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24" exitCode=1 Mar 08 03:26:22.847072 master-0 kubenswrapper[13046]: I0308 03:26:22.846990 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b","Type":"ContainerDied","Data":"61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24"} Mar 08 03:26:22.847156 master-0 kubenswrapper[13046]: I0308 03:26:22.847005 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 08 03:26:22.847156 master-0 kubenswrapper[13046]: I0308 03:26:22.847113 13046 scope.go:117] "RemoveContainer" containerID="61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24" Mar 08 03:26:22.847233 master-0 kubenswrapper[13046]: I0308 03:26:22.847073 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cfa0b8bd-fa79-4b31-a43d-ccb2709e653b","Type":"ContainerDied","Data":"02f2d6e6b199f8378e5bd76b83e4837ea3acf83f6c358c0f384995a8a9484b9c"} Mar 08 03:26:22.849960 master-0 kubenswrapper[13046]: I0308 03:26:22.849872 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7c77ceff52dd4a01c709016eef561173","Type":"ContainerStarted","Data":"444586abdf1eb02151f4f0d8b659d3442ea238eb9dcd40642fd48ac22b982d0e"} Mar 08 03:26:22.850033 master-0 kubenswrapper[13046]: I0308 03:26:22.849981 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7c77ceff52dd4a01c709016eef561173","Type":"ContainerStarted","Data":"ffcab8e8a9224e56398127d460b407ca28301c839e49ca3a620a817affea775a"} Mar 08 03:26:22.876316 master-0 kubenswrapper[13046]: I0308 03:26:22.876282 13046 scope.go:117] "RemoveContainer" containerID="61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24" Mar 08 03:26:22.877734 master-0 kubenswrapper[13046]: E0308 03:26:22.877669 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24\": container with ID starting with 61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24 not found: ID does not exist" containerID="61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24" Mar 08 
03:26:22.878608 master-0 kubenswrapper[13046]: I0308 03:26:22.877747 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24"} err="failed to get container status \"61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24\": rpc error: code = NotFound desc = could not find container \"61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24\": container with ID starting with 61f215d2c0ab69fd13fe77735372a013ed450512f2657ea9e564738af3d5ce24 not found: ID does not exist" Mar 08 03:26:22.913156 master-0 kubenswrapper[13046]: I0308 03:26:22.913098 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:26:22.925327 master-0 kubenswrapper[13046]: I0308 03:26:22.925101 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 08 03:26:23.422590 master-0 kubenswrapper[13046]: I0308 03:26:23.421737 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 03:26:23.422590 master-0 kubenswrapper[13046]: I0308 03:26:23.421815 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 03:26:23.522509 master-0 kubenswrapper[13046]: I0308 03:26:23.516998 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: 
connection refused" start-of-body= Mar 08 03:26:23.522509 master-0 kubenswrapper[13046]: I0308 03:26:23.517100 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" Mar 08 03:26:23.566199 master-0 kubenswrapper[13046]: I0308 03:26:23.565998 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:26:23.571477 master-0 kubenswrapper[13046]: I0308 03:26:23.571387 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-766946c477-ghtbv" Mar 08 03:26:23.864883 master-0 kubenswrapper[13046]: I0308 03:26:23.864827 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7c77ceff52dd4a01c709016eef561173","Type":"ContainerStarted","Data":"6b07656ab81c9ce4b17cc09d76b919a7da91ea871b80b51cd6b6ab1238cec20c"} Mar 08 03:26:23.864883 master-0 kubenswrapper[13046]: I0308 03:26:23.864878 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7c77ceff52dd4a01c709016eef561173","Type":"ContainerStarted","Data":"f888f74a5f119bc907570a77b2ac9835e190c2679d610c0dfea1cdc83f894f9e"} Mar 08 03:26:23.864883 master-0 kubenswrapper[13046]: I0308 03:26:23.864893 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7c77ceff52dd4a01c709016eef561173","Type":"ContainerStarted","Data":"83315a3167b78c9e1840120dea829816e54d1be23fb46bd20d9f7953fe4b1845"} Mar 08 03:26:23.896206 master-0 kubenswrapper[13046]: I0308 03:26:23.896110 13046 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.8960829590000001 podStartE2EDuration="1.896082959s" podCreationTimestamp="2026-03-08 03:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:26:23.891836882 +0000 UTC m=+785.970604139" watchObservedRunningTime="2026-03-08 03:26:23.896082959 +0000 UTC m=+785.974850186" Mar 08 03:26:24.126337 master-0 kubenswrapper[13046]: I0308 03:26:24.126216 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" path="/var/lib/kubelet/pods/cfa0b8bd-fa79-4b31-a43d-ccb2709e653b/volumes" Mar 08 03:26:25.385205 master-0 kubenswrapper[13046]: I0308 03:26:25.385016 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-84f57b9877-h4zwb" Mar 08 03:26:32.192868 master-0 kubenswrapper[13046]: I0308 03:26:32.192769 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:32.192868 master-0 kubenswrapper[13046]: I0308 03:26:32.192862 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:32.192868 master-0 kubenswrapper[13046]: I0308 03:26:32.192885 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:32.194039 master-0 kubenswrapper[13046]: I0308 03:26:32.192904 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:32.200135 master-0 kubenswrapper[13046]: I0308 03:26:32.200068 13046 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:32.200288 master-0 kubenswrapper[13046]: I0308 03:26:32.200168 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:32.968761 master-0 kubenswrapper[13046]: I0308 03:26:32.968675 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:32.974220 master-0 kubenswrapper[13046]: I0308 03:26:32.974165 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 08 03:26:33.421745 master-0 kubenswrapper[13046]: I0308 03:26:33.421646 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 03:26:33.421745 master-0 kubenswrapper[13046]: I0308 03:26:33.421729 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 03:26:33.514472 master-0 kubenswrapper[13046]: I0308 03:26:33.514395 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body= Mar 08 03:26:33.514935 master-0 kubenswrapper[13046]: I0308 03:26:33.514888 13046 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" Mar 08 03:26:43.421675 master-0 kubenswrapper[13046]: I0308 03:26:43.421605 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 03:26:43.422236 master-0 kubenswrapper[13046]: I0308 03:26:43.421682 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 03:26:43.513788 master-0 kubenswrapper[13046]: I0308 03:26:43.513712 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body= Mar 08 03:26:43.514009 master-0 kubenswrapper[13046]: I0308 03:26:43.513791 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" Mar 08 03:26:46.609769 master-0 kubenswrapper[13046]: I0308 03:26:46.609675 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:26:46.659537 master-0 kubenswrapper[13046]: I0308 03:26:46.659428 13046 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:26:47.174434 master-0 kubenswrapper[13046]: I0308 03:26:47.174350 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:26:53.422130 master-0 kubenswrapper[13046]: I0308 03:26:53.421951 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Mar 08 03:26:53.422130 master-0 kubenswrapper[13046]: I0308 03:26:53.422064 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Mar 08 03:26:53.513994 master-0 kubenswrapper[13046]: I0308 03:26:53.513904 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body= Mar 08 03:26:53.514261 master-0 kubenswrapper[13046]: I0308 03:26:53.514008 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" Mar 08 03:27:00.378033 master-0 kubenswrapper[13046]: I0308 03:27:00.377954 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:27:00.378952 master-0 kubenswrapper[13046]: I0308 03:27:00.378420 13046 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="alertmanager" containerID="cri-o://b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0" gracePeriod=120 Mar 08 03:27:00.378952 master-0 kubenswrapper[13046]: I0308 03:27:00.378535 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-web" containerID="cri-o://d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392" gracePeriod=120 Mar 08 03:27:00.378952 master-0 kubenswrapper[13046]: I0308 03:27:00.378637 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="prom-label-proxy" containerID="cri-o://90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15" gracePeriod=120 Mar 08 03:27:00.378952 master-0 kubenswrapper[13046]: I0308 03:27:00.378556 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy" containerID="cri-o://c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb" gracePeriod=120 Mar 08 03:27:00.378952 master-0 kubenswrapper[13046]: I0308 03:27:00.378543 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="config-reloader" containerID="cri-o://38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301" gracePeriod=120 Mar 08 03:27:00.380356 master-0 kubenswrapper[13046]: I0308 03:27:00.380216 13046 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-metric" containerID="cri-o://e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b" gracePeriod=120 Mar 08 03:27:01.283724 master-0 kubenswrapper[13046]: I0308 03:27:01.283648 13046 generic.go:334] "Generic (PLEG): container finished" podID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerID="90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15" exitCode=0 Mar 08 03:27:01.283724 master-0 kubenswrapper[13046]: I0308 03:27:01.283694 13046 generic.go:334] "Generic (PLEG): container finished" podID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerID="e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b" exitCode=0 Mar 08 03:27:01.283724 master-0 kubenswrapper[13046]: I0308 03:27:01.283709 13046 generic.go:334] "Generic (PLEG): container finished" podID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerID="c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb" exitCode=0 Mar 08 03:27:01.283724 master-0 kubenswrapper[13046]: I0308 03:27:01.283723 13046 generic.go:334] "Generic (PLEG): container finished" podID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerID="38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301" exitCode=0 Mar 08 03:27:01.283724 master-0 kubenswrapper[13046]: I0308 03:27:01.283773 13046 generic.go:334] "Generic (PLEG): container finished" podID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerID="b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0" exitCode=0 Mar 08 03:27:01.284308 master-0 kubenswrapper[13046]: I0308 03:27:01.283808 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15"} Mar 08 03:27:01.284308 master-0 kubenswrapper[13046]: I0308 03:27:01.283909 13046 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b"} Mar 08 03:27:01.284308 master-0 kubenswrapper[13046]: I0308 03:27:01.283944 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb"} Mar 08 03:27:01.284308 master-0 kubenswrapper[13046]: I0308 03:27:01.283974 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301"} Mar 08 03:27:01.284308 master-0 kubenswrapper[13046]: I0308 03:27:01.284002 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0"} Mar 08 03:27:02.017957 master-0 kubenswrapper[13046]: I0308 03:27:02.017893 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:27:02.127786 master-0 kubenswrapper[13046]: I0308 03:27:02.127726 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-web-config\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.127972 master-0 kubenswrapper[13046]: I0308 03:27:02.127802 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-main-db\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.127972 master-0 kubenswrapper[13046]: I0308 03:27:02.127847 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.127972 master-0 kubenswrapper[13046]: I0308 03:27:02.127912 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-out\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128139 master-0 kubenswrapper[13046]: I0308 03:27:02.127984 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-main-tls\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128139 
master-0 kubenswrapper[13046]: I0308 03:27:02.128047 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-tls-assets\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128227 master-0 kubenswrapper[13046]: I0308 03:27:02.128152 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-metric\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128227 master-0 kubenswrapper[13046]: I0308 03:27:02.128200 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-trusted-ca-bundle\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128434 master-0 kubenswrapper[13046]: I0308 03:27:02.128251 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-web\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128434 master-0 kubenswrapper[13046]: I0308 03:27:02.128303 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-metrics-client-ca\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128434 master-0 
kubenswrapper[13046]: I0308 03:27:02.128357 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms88d\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-kube-api-access-ms88d\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.128434 master-0 kubenswrapper[13046]: I0308 03:27:02.128399 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-volume\") pod \"36325b45-a65e-4b32-9db9-3e4b31bcf287\" (UID: \"36325b45-a65e-4b32-9db9-3e4b31bcf287\") " Mar 08 03:27:02.130052 master-0 kubenswrapper[13046]: I0308 03:27:02.128348 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:27:02.130423 master-0 kubenswrapper[13046]: I0308 03:27:02.130368 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:27:02.130547 master-0 kubenswrapper[13046]: I0308 03:27:02.130459 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:27:02.132811 master-0 kubenswrapper[13046]: I0308 03:27:02.132753 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:27:02.132902 master-0 kubenswrapper[13046]: I0308 03:27:02.132750 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:27:02.133314 master-0 kubenswrapper[13046]: I0308 03:27:02.133269 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:27:02.133573 master-0 kubenswrapper[13046]: I0308 03:27:02.133175 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-out" (OuterVolumeSpecName: "config-out") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:27:02.133573 master-0 kubenswrapper[13046]: I0308 03:27:02.133401 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-volume" (OuterVolumeSpecName: "config-volume") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:27:02.133781 master-0 kubenswrapper[13046]: I0308 03:27:02.133745 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:27:02.134005 master-0 kubenswrapper[13046]: I0308 03:27:02.133959 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:27:02.134648 master-0 kubenswrapper[13046]: I0308 03:27:02.134605 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-kube-api-access-ms88d" (OuterVolumeSpecName: "kube-api-access-ms88d") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "kube-api-access-ms88d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:27:02.190680 master-0 kubenswrapper[13046]: I0308 03:27:02.190607 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-web-config" (OuterVolumeSpecName: "web-config") pod "36325b45-a65e-4b32-9db9-3e4b31bcf287" (UID: "36325b45-a65e-4b32-9db9-3e4b31bcf287"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:27:02.231319 master-0 kubenswrapper[13046]: I0308 03:27:02.231022 13046 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-volume\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231319 master-0 kubenswrapper[13046]: I0308 03:27:02.231235 13046 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-web-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231319 master-0 kubenswrapper[13046]: I0308 03:27:02.231261 13046 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231319 master-0 kubenswrapper[13046]: I0308 03:27:02.231285 13046 reconciler_common.go:293] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231319 master-0 kubenswrapper[13046]: I0308 03:27:02.231306 13046 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36325b45-a65e-4b32-9db9-3e4b31bcf287-config-out\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231319 master-0 kubenswrapper[13046]: I0308 03:27:02.231328 13046 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231778 master-0 kubenswrapper[13046]: I0308 03:27:02.231350 13046 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231778 master-0 kubenswrapper[13046]: I0308 03:27:02.231370 13046 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231778 master-0 kubenswrapper[13046]: I0308 03:27:02.231461 13046 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231778 master-0 kubenswrapper[13046]: I0308 03:27:02.231608 13046 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/36325b45-a65e-4b32-9db9-3e4b31bcf287-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231778 master-0 kubenswrapper[13046]: I0308 03:27:02.231630 13046 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36325b45-a65e-4b32-9db9-3e4b31bcf287-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.231778 master-0 kubenswrapper[13046]: I0308 03:27:02.231651 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms88d\" (UniqueName: \"kubernetes.io/projected/36325b45-a65e-4b32-9db9-3e4b31bcf287-kube-api-access-ms88d\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:02.303065 master-0 kubenswrapper[13046]: I0308 03:27:02.301310 13046 generic.go:334] "Generic (PLEG): container finished" podID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerID="d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392" exitCode=0 Mar 08 03:27:02.303065 master-0 kubenswrapper[13046]: I0308 03:27:02.301364 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392"} Mar 08 03:27:02.303065 master-0 kubenswrapper[13046]: I0308 03:27:02.301395 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"36325b45-a65e-4b32-9db9-3e4b31bcf287","Type":"ContainerDied","Data":"0cd6f3481c6d84187e23be93062fbf80fc34bdc67d26a48fc364b934826d50e4"} Mar 08 03:27:02.303065 master-0 kubenswrapper[13046]: I0308 03:27:02.301415 13046 scope.go:117] "RemoveContainer" containerID="90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15" Mar 08 03:27:02.303065 master-0 kubenswrapper[13046]: I0308 03:27:02.301450 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:27:02.322733 master-0 kubenswrapper[13046]: I0308 03:27:02.322696 13046 scope.go:117] "RemoveContainer" containerID="e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b" Mar 08 03:27:02.349050 master-0 kubenswrapper[13046]: I0308 03:27:02.348625 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:27:02.349344 master-0 kubenswrapper[13046]: I0308 03:27:02.349130 13046 scope.go:117] "RemoveContainer" containerID="c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb" Mar 08 03:27:02.352129 master-0 kubenswrapper[13046]: I0308 03:27:02.352061 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:27:02.364583 master-0 kubenswrapper[13046]: I0308 03:27:02.364345 13046 scope.go:117] "RemoveContainer" containerID="d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392" Mar 08 03:27:02.386145 master-0 kubenswrapper[13046]: I0308 03:27:02.386094 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:27:02.386377 master-0 kubenswrapper[13046]: E0308 03:27:02.386354 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="prom-label-proxy" Mar 08 03:27:02.386377 master-0 kubenswrapper[13046]: I0308 03:27:02.386371 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="prom-label-proxy" Mar 08 03:27:02.386443 master-0 kubenswrapper[13046]: E0308 03:27:02.386402 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-metric" Mar 08 03:27:02.386443 master-0 kubenswrapper[13046]: I0308 03:27:02.386409 13046 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-metric" Mar 08 03:27:02.386443 master-0 kubenswrapper[13046]: E0308 03:27:02.386419 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy" Mar 08 03:27:02.386443 master-0 kubenswrapper[13046]: I0308 03:27:02.386426 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy" Mar 08 03:27:02.386443 master-0 kubenswrapper[13046]: E0308 03:27:02.386437 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="alertmanager" Mar 08 03:27:02.386443 master-0 kubenswrapper[13046]: I0308 03:27:02.386443 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="alertmanager" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: E0308 03:27:02.386454 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="init-config-reloader" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: I0308 03:27:02.386462 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="init-config-reloader" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: E0308 03:27:02.386472 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c539a17-b57b-446a-b50d-976adc8766ef" containerName="installer" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: I0308 03:27:02.386481 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c539a17-b57b-446a-b50d-976adc8766ef" containerName="installer" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: E0308 03:27:02.386507 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="config-reloader" Mar 
08 03:27:02.386630 master-0 kubenswrapper[13046]: I0308 03:27:02.386516 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="config-reloader" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: E0308 03:27:02.386528 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158946fc-eae1-4823-a93c-398d4aede495" containerName="console" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: I0308 03:27:02.386537 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="158946fc-eae1-4823-a93c-398d4aede495" containerName="console" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: E0308 03:27:02.386548 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-web" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: I0308 03:27:02.386555 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-web" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: E0308 03:27:02.386567 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" containerName="installer" Mar 08 03:27:02.386630 master-0 kubenswrapper[13046]: I0308 03:27:02.386574 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" containerName="installer" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386700 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-metric" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386718 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa0b8bd-fa79-4b31-a43d-ccb2709e653b" containerName="installer" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386727 13046 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6c539a17-b57b-446a-b50d-976adc8766ef" containerName="installer" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386743 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="alertmanager" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386757 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy-web" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386767 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="158946fc-eae1-4823-a93c-398d4aede495" containerName="console" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386782 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="config-reloader" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386796 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="kube-rbac-proxy" Mar 08 03:27:02.386949 master-0 kubenswrapper[13046]: I0308 03:27:02.386808 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" containerName="prom-label-proxy" Mar 08 03:27:02.388610 master-0 kubenswrapper[13046]: I0308 03:27:02.388589 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 08 03:27:02.392059 master-0 kubenswrapper[13046]: I0308 03:27:02.392004 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 08 03:27:02.392166 master-0 kubenswrapper[13046]: I0308 03:27:02.392149 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 03:27:02.392263 master-0 kubenswrapper[13046]: I0308 03:27:02.392234 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 03:27:02.392313 master-0 kubenswrapper[13046]: I0308 03:27:02.392279 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 08 03:27:02.392364 master-0 kubenswrapper[13046]: I0308 03:27:02.392255 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-hrrx2" Mar 08 03:27:02.392630 master-0 kubenswrapper[13046]: I0308 03:27:02.392613 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 08 03:27:02.392711 master-0 kubenswrapper[13046]: I0308 03:27:02.392696 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 03:27:02.392753 master-0 kubenswrapper[13046]: I0308 03:27:02.392707 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 03:27:02.412604 master-0 kubenswrapper[13046]: I0308 03:27:02.403312 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 08 03:27:02.412604 master-0 kubenswrapper[13046]: I0308 03:27:02.403505 13046 scope.go:117] "RemoveContainer" 
containerID="38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301" Mar 08 03:27:02.433971 master-0 kubenswrapper[13046]: I0308 03:27:02.433924 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 08 03:27:02.440157 master-0 kubenswrapper[13046]: I0308 03:27:02.440050 13046 scope.go:117] "RemoveContainer" containerID="b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0" Mar 08 03:27:02.458322 master-0 kubenswrapper[13046]: I0308 03:27:02.458046 13046 scope.go:117] "RemoveContainer" containerID="7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c" Mar 08 03:27:02.474598 master-0 kubenswrapper[13046]: I0308 03:27:02.473940 13046 scope.go:117] "RemoveContainer" containerID="90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15" Mar 08 03:27:02.475187 master-0 kubenswrapper[13046]: E0308 03:27:02.474948 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15\": container with ID starting with 90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15 not found: ID does not exist" containerID="90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15" Mar 08 03:27:02.475187 master-0 kubenswrapper[13046]: I0308 03:27:02.474995 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15"} err="failed to get container status \"90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15\": rpc error: code = NotFound desc = could not find container \"90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15\": container with ID starting with 90128c13aad4e44d2d5ebebc88037d6ce488375357b1855924ca8be9ddd61a15 not found: ID does not exist" Mar 08 03:27:02.475187 master-0 kubenswrapper[13046]: I0308 03:27:02.475022 
13046 scope.go:117] "RemoveContainer" containerID="e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b" Mar 08 03:27:02.475755 master-0 kubenswrapper[13046]: E0308 03:27:02.475637 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b\": container with ID starting with e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b not found: ID does not exist" containerID="e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b" Mar 08 03:27:02.475755 master-0 kubenswrapper[13046]: I0308 03:27:02.475660 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b"} err="failed to get container status \"e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b\": rpc error: code = NotFound desc = could not find container \"e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b\": container with ID starting with e5adeeab3ab417e07bae10242ee500f502ee123f492357bb408e22047f5d9a1b not found: ID does not exist" Mar 08 03:27:02.475755 master-0 kubenswrapper[13046]: I0308 03:27:02.475679 13046 scope.go:117] "RemoveContainer" containerID="c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb" Mar 08 03:27:02.476211 master-0 kubenswrapper[13046]: E0308 03:27:02.476118 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb\": container with ID starting with c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb not found: ID does not exist" containerID="c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb" Mar 08 03:27:02.476211 master-0 kubenswrapper[13046]: I0308 03:27:02.476167 13046 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb"} err="failed to get container status \"c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb\": rpc error: code = NotFound desc = could not find container \"c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb\": container with ID starting with c6b09056c6cf55a19151fff1471d6218daf541120c725aa491ecbfc8502abdbb not found: ID does not exist" Mar 08 03:27:02.476211 master-0 kubenswrapper[13046]: I0308 03:27:02.476195 13046 scope.go:117] "RemoveContainer" containerID="d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392" Mar 08 03:27:02.476576 master-0 kubenswrapper[13046]: E0308 03:27:02.476468 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392\": container with ID starting with d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392 not found: ID does not exist" containerID="d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392" Mar 08 03:27:02.476576 master-0 kubenswrapper[13046]: I0308 03:27:02.476557 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392"} err="failed to get container status \"d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392\": rpc error: code = NotFound desc = could not find container \"d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392\": container with ID starting with d9e590cd26a02e6f8256bf2bfae0154839018bc47dc53e5497230aec1a549392 not found: ID does not exist" Mar 08 03:27:02.476691 master-0 kubenswrapper[13046]: I0308 03:27:02.476586 13046 scope.go:117] "RemoveContainer" containerID="38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301" Mar 08 03:27:02.477004 master-0 
kubenswrapper[13046]: E0308 03:27:02.476957 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301\": container with ID starting with 38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301 not found: ID does not exist" containerID="38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301" Mar 08 03:27:02.477061 master-0 kubenswrapper[13046]: I0308 03:27:02.477005 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301"} err="failed to get container status \"38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301\": rpc error: code = NotFound desc = could not find container \"38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301\": container with ID starting with 38772116eef981906e9dc71e2c0f471ce148ddc594e9295f9646d012fc71d301 not found: ID does not exist" Mar 08 03:27:02.477061 master-0 kubenswrapper[13046]: I0308 03:27:02.477027 13046 scope.go:117] "RemoveContainer" containerID="b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0" Mar 08 03:27:02.478699 master-0 kubenswrapper[13046]: E0308 03:27:02.477398 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0\": container with ID starting with b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0 not found: ID does not exist" containerID="b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0" Mar 08 03:27:02.478699 master-0 kubenswrapper[13046]: I0308 03:27:02.477428 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0"} err="failed to get container 
status \"b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0\": rpc error: code = NotFound desc = could not find container \"b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0\": container with ID starting with b69b1f820375e0d0254df7d9780cb932e2583043af3946051c143cbe33219fa0 not found: ID does not exist"
Mar 08 03:27:02.478699 master-0 kubenswrapper[13046]: I0308 03:27:02.477449 13046 scope.go:117] "RemoveContainer" containerID="7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c"
Mar 08 03:27:02.479446 master-0 kubenswrapper[13046]: E0308 03:27:02.479419 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c\": container with ID starting with 7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c not found: ID does not exist" containerID="7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c"
Mar 08 03:27:02.479530 master-0 kubenswrapper[13046]: I0308 03:27:02.479449 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c"} err="failed to get container status \"7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c\": rpc error: code = NotFound desc = could not find container \"7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c\": container with ID starting with 7a9a62037331b75fb745474a6ab021a20ba796809098b97712340b3ee7ff2a3c not found: ID does not exist"
Mar 08 03:27:02.497736 master-0 kubenswrapper[13046]: I0308 03:27:02.497634 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 08 03:27:02.538246 master-0 kubenswrapper[13046]: I0308 03:27:02.538180 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-config-volume\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538435 master-0 kubenswrapper[13046]: I0308 03:27:02.538349 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eb45ce29-bddc-46ac-a337-0ca37b46714e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538435 master-0 kubenswrapper[13046]: I0308 03:27:02.538402 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb45ce29-bddc-46ac-a337-0ca37b46714e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538524 master-0 kubenswrapper[13046]: I0308 03:27:02.538510 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb45ce29-bddc-46ac-a337-0ca37b46714e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538579 master-0 kubenswrapper[13046]: I0308 03:27:02.538558 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538718 master-0 kubenswrapper[13046]: I0308 03:27:02.538677 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-web-config\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538762 master-0 kubenswrapper[13046]: I0308 03:27:02.538725 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb45ce29-bddc-46ac-a337-0ca37b46714e-config-out\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538821 master-0 kubenswrapper[13046]: I0308 03:27:02.538800 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538965 master-0 kubenswrapper[13046]: I0308 03:27:02.538879 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzp9x\" (UniqueName: \"kubernetes.io/projected/eb45ce29-bddc-46ac-a337-0ca37b46714e-kube-api-access-vzp9x\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.538965 master-0 kubenswrapper[13046]: I0308 03:27:02.538957 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.539058 master-0 kubenswrapper[13046]: I0308 03:27:02.539010 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.539097 master-0 kubenswrapper[13046]: I0308 03:27:02.539068 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb45ce29-bddc-46ac-a337-0ca37b46714e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.640842 master-0 kubenswrapper[13046]: I0308 03:27:02.640766 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641073 master-0 kubenswrapper[13046]: I0308 03:27:02.640940 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641073 master-0 kubenswrapper[13046]: I0308 03:27:02.640978 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb45ce29-bddc-46ac-a337-0ca37b46714e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641073 master-0 kubenswrapper[13046]: I0308 03:27:02.640994 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-config-volume\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641073 master-0 kubenswrapper[13046]: I0308 03:27:02.641027 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eb45ce29-bddc-46ac-a337-0ca37b46714e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641431 master-0 kubenswrapper[13046]: I0308 03:27:02.641392 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb45ce29-bddc-46ac-a337-0ca37b46714e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641537 master-0 kubenswrapper[13046]: I0308 03:27:02.641511 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb45ce29-bddc-46ac-a337-0ca37b46714e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641606 master-0 kubenswrapper[13046]: I0308 03:27:02.641581 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641656 master-0 kubenswrapper[13046]: I0308 03:27:02.641640 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-web-config\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641694 master-0 kubenswrapper[13046]: I0308 03:27:02.641664 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb45ce29-bddc-46ac-a337-0ca37b46714e-config-out\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641730 master-0 kubenswrapper[13046]: I0308 03:27:02.641709 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.641780 master-0 kubenswrapper[13046]: I0308 03:27:02.641763 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzp9x\" (UniqueName: \"kubernetes.io/projected/eb45ce29-bddc-46ac-a337-0ca37b46714e-kube-api-access-vzp9x\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.643138 master-0 kubenswrapper[13046]: I0308 03:27:02.643107 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eb45ce29-bddc-46ac-a337-0ca37b46714e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.643467 master-0 kubenswrapper[13046]: I0308 03:27:02.643440 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eb45ce29-bddc-46ac-a337-0ca37b46714e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.643535 master-0 kubenswrapper[13046]: I0308 03:27:02.643459 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb45ce29-bddc-46ac-a337-0ca37b46714e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.644504 master-0 kubenswrapper[13046]: I0308 03:27:02.644452 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-config-volume\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.644639 master-0 kubenswrapper[13046]: I0308 03:27:02.644607 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eb45ce29-bddc-46ac-a337-0ca37b46714e-config-out\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.645120 master-0 kubenswrapper[13046]: I0308 03:27:02.645080 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-web-config\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.645404 master-0 kubenswrapper[13046]: I0308 03:27:02.645380 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.649025 master-0 kubenswrapper[13046]: I0308 03:27:02.648984 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.649152 master-0 kubenswrapper[13046]: I0308 03:27:02.649117 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.649209 master-0 kubenswrapper[13046]: I0308 03:27:02.649189 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eb45ce29-bddc-46ac-a337-0ca37b46714e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.649300 master-0 kubenswrapper[13046]: I0308 03:27:02.649269 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eb45ce29-bddc-46ac-a337-0ca37b46714e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.657356 master-0 kubenswrapper[13046]: I0308 03:27:02.657311 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzp9x\" (UniqueName: \"kubernetes.io/projected/eb45ce29-bddc-46ac-a337-0ca37b46714e-kube-api-access-vzp9x\") pod \"alertmanager-main-0\" (UID: \"eb45ce29-bddc-46ac-a337-0ca37b46714e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:02.733059 master-0 kubenswrapper[13046]: I0308 03:27:02.732996 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 08 03:27:03.240748 master-0 kubenswrapper[13046]: I0308 03:27:03.240141 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 08 03:27:03.310212 master-0 kubenswrapper[13046]: I0308 03:27:03.310169 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerStarted","Data":"dd35f242b8686812bc921a1de3ce5f3584717945d475522202e3b9b0b155228d"}
Mar 08 03:27:03.421737 master-0 kubenswrapper[13046]: I0308 03:27:03.421655 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 03:27:03.421737 master-0 kubenswrapper[13046]: I0308 03:27:03.421731 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 03:27:03.513391 master-0 kubenswrapper[13046]: I0308 03:27:03.513186 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body=
Mar 08 03:27:03.513391 master-0 kubenswrapper[13046]: I0308 03:27:03.513306 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused"
Mar 08 03:27:03.944004 master-0 kubenswrapper[13046]: I0308 03:27:03.943943 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 08 03:27:03.945114 master-0 kubenswrapper[13046]: I0308 03:27:03.945077 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:03.945828 master-0 kubenswrapper[13046]: I0308 03:27:03.945782 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 08 03:27:03.946408 master-0 kubenswrapper[13046]: I0308 03:27:03.946365 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" containerID="cri-o://7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59" gracePeriod=15
Mar 08 03:27:03.946605 master-0 kubenswrapper[13046]: I0308 03:27:03.946552 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" containerID="cri-o://893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4" gracePeriod=15
Mar 08 03:27:03.946712 master-0 kubenswrapper[13046]: I0308 03:27:03.946416 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b" gracePeriod=15
Mar 08 03:27:03.946712 master-0 kubenswrapper[13046]: I0308 03:27:03.946432 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b" gracePeriod=15
Mar 08 03:27:03.946888 master-0 kubenswrapper[13046]: I0308 03:27:03.946462 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" containerID="cri-o://d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62" gracePeriod=15
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.948796 13046 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: E0308 03:27:03.949158 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949183 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: E0308 03:27:03.949204 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949215 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: E0308 03:27:03.949233 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949245 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: E0308 03:27:03.949271 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949284 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: E0308 03:27:03.949309 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949321 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: E0308 03:27:03.949353 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949361 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949564 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949582 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949597 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949622 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 08 03:27:03.949867 master-0 kubenswrapper[13046]: I0308 03:27:03.949642 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 08 03:27:04.075264 master-0 kubenswrapper[13046]: I0308 03:27:04.075229 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.075406 master-0 kubenswrapper[13046]: I0308 03:27:04.075286 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.075406 master-0 kubenswrapper[13046]: I0308 03:27:04.075304 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.075406 master-0 kubenswrapper[13046]: I0308 03:27:04.075323 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.075406 master-0 kubenswrapper[13046]: I0308 03:27:04.075341 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.075406 master-0 kubenswrapper[13046]: I0308 03:27:04.075368 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.075406 master-0 kubenswrapper[13046]: I0308 03:27:04.075404 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.075704 master-0 kubenswrapper[13046]: I0308 03:27:04.075441 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.079618 master-0 kubenswrapper[13046]: E0308 03:27:04.079453 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.128954 master-0 kubenswrapper[13046]: I0308 03:27:04.128897 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36325b45-a65e-4b32-9db9-3e4b31bcf287" path="/var/lib/kubelet/pods/36325b45-a65e-4b32-9db9-3e4b31bcf287/volumes"
Mar 08 03:27:04.176230 master-0 kubenswrapper[13046]: I0308 03:27:04.176174 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176230 master-0 kubenswrapper[13046]: I0308 03:27:04.176217 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.176230 master-0 kubenswrapper[13046]: I0308 03:27:04.176238 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176445 master-0 kubenswrapper[13046]: I0308 03:27:04.176257 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176445 master-0 kubenswrapper[13046]: I0308 03:27:04.176355 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.176620 master-0 kubenswrapper[13046]: I0308 03:27:04.176451 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.176620 master-0 kubenswrapper[13046]: I0308 03:27:04.176518 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176620 master-0 kubenswrapper[13046]: I0308 03:27:04.176611 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.176710 master-0 kubenswrapper[13046]: I0308 03:27:04.176616 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176710 master-0 kubenswrapper[13046]: I0308 03:27:04.176669 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.176710 master-0 kubenswrapper[13046]: I0308 03:27:04.176685 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176814 master-0 kubenswrapper[13046]: I0308 03:27:04.176760 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:04.176875 master-0 kubenswrapper[13046]: I0308 03:27:04.176847 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176913 master-0 kubenswrapper[13046]: I0308 03:27:04.176885 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.176945 master-0 kubenswrapper[13046]: I0308 03:27:04.176930 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.177023 master-0 kubenswrapper[13046]: I0308 03:27:04.176998 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:04.324147 master-0 kubenswrapper[13046]: I0308 03:27:04.324032 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log"
Mar 08 03:27:04.325252 master-0 kubenswrapper[13046]: I0308 03:27:04.325196 13046 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62" exitCode=0
Mar 08 03:27:04.325298 master-0 kubenswrapper[13046]: I0308 03:27:04.325243 13046 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b" exitCode=0
Mar 08 03:27:04.325364 master-0 kubenswrapper[13046]: I0308 03:27:04.325303 13046 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b" exitCode=0
Mar 08 03:27:04.325364 master-0 kubenswrapper[13046]: I0308 03:27:04.325323 13046 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4" exitCode=2
Mar 08 03:27:04.328384 master-0 kubenswrapper[13046]: I0308 03:27:04.328357 13046 generic.go:334] "Generic (PLEG): container finished" podID="eb45ce29-bddc-46ac-a337-0ca37b46714e" containerID="cab9af27a83433f5ee96392dc4b02d93d7652bf46fb86e1d19ca32b153a07980" exitCode=0
Mar 08 03:27:04.328455 master-0 kubenswrapper[13046]: I0308 03:27:04.328403 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerDied","Data":"cab9af27a83433f5ee96392dc4b02d93d7652bf46fb86e1d19ca32b153a07980"}
Mar 08 03:27:04.330161 master-0 kubenswrapper[13046]: I0308 03:27:04.330077 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:04.332388 master-0 kubenswrapper[13046]: E0308 03:27:04.332226 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{alertmanager-main-0.189abfec282b9b47 openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:alertmanager-main-0,UID:eb45ce29-bddc-46ac-a337-0ca37b46714e,APIVersion:v1,ResourceVersion:16085,FieldPath:spec.containers{alertmanager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e43499c79a8b5d642b3376af9595daaf45f91b3f616c93b24155f0d47003963\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:27:04.330844999 +0000 UTC m=+826.409612226,LastTimestamp:2026-03-08 03:27:04.330844999 +0000 UTC m=+826.409612226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:27:04.380321 master-0 kubenswrapper[13046]: I0308 03:27:04.380242 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:27:04.411791 master-0 kubenswrapper[13046]: W0308 03:27:04.411735 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda814bd60de133d95cf99630a978c017e.slice/crio-0ad9774f254bbc6b99a15e79035034e11c6a9a0099dd6d13d22df4af2ed604b3 WatchSource:0}: Error finding container 0ad9774f254bbc6b99a15e79035034e11c6a9a0099dd6d13d22df4af2ed604b3: Status 404 returned error can't find the container with id 0ad9774f254bbc6b99a15e79035034e11c6a9a0099dd6d13d22df4af2ed604b3 Mar 08 03:27:04.635931 master-0 kubenswrapper[13046]: I0308 03:27:04.635633 13046 patch_prober.go:28] interesting pod/machine-config-daemon-j6n9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 03:27:04.636051 master-0 kubenswrapper[13046]: I0308 03:27:04.635952 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j6n9g" podUID="1092f2a6-865c-4706-bba7-068621e85ebc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 03:27:05.346156 master-0 kubenswrapper[13046]: I0308 03:27:05.346093 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerStarted","Data":"66808eb820b32113f76fb618ffa2c085c627f81938d3148961b3141c1fd4e113"} Mar 08 03:27:05.346848 master-0 kubenswrapper[13046]: I0308 03:27:05.346182 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerStarted","Data":"d8709346749d0b5c75ddf14159fe4a6aedab3cd53fa944721fd1f553d59fc995"} Mar 08 03:27:05.346848 master-0 kubenswrapper[13046]: I0308 03:27:05.346217 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerStarted","Data":"cde4f9256b5bf75cba4a221e3a43416fffc295f93eb553cb4a87f2b582da91c2"} Mar 08 03:27:05.346848 master-0 kubenswrapper[13046]: I0308 03:27:05.346244 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerStarted","Data":"c653906b079f173e20880f848d549b74c3aeef0c185a74759b31b88259a06438"} Mar 08 03:27:05.346848 master-0 kubenswrapper[13046]: I0308 03:27:05.346263 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerStarted","Data":"cad061e6e30869755190bdddc7ed03e6bedb539d372e4dacecc26b476bfae73d"} Mar 08 03:27:05.348667 master-0 kubenswrapper[13046]: I0308 03:27:05.348624 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3"} Mar 08 03:27:05.349536 master-0 kubenswrapper[13046]: I0308 03:27:05.349469 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"0ad9774f254bbc6b99a15e79035034e11c6a9a0099dd6d13d22df4af2ed604b3"} Mar 08 03:27:05.350354 master-0 kubenswrapper[13046]: I0308 03:27:05.350306 13046 status_manager.go:851] "Failed to get status for pod" 
podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:05.350463 master-0 kubenswrapper[13046]: E0308 03:27:05.350398 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:27:05.869035 master-0 kubenswrapper[13046]: E0308 03:27:05.868926 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:05.870008 master-0 kubenswrapper[13046]: E0308 03:27:05.869950 
13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:05.871046 master-0 kubenswrapper[13046]: E0308 03:27:05.870987 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:05.872016 master-0 kubenswrapper[13046]: E0308 03:27:05.871966 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:05.872989 master-0 kubenswrapper[13046]: E0308 03:27:05.872928 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:05.872989 master-0 kubenswrapper[13046]: E0308 03:27:05.872976 13046 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 03:27:06.364057 master-0 kubenswrapper[13046]: I0308 03:27:06.363981 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eb45ce29-bddc-46ac-a337-0ca37b46714e","Type":"ContainerStarted","Data":"9f46b69eb39e3587c49fc163415526075e23d486b52417fc9293667d7950d2f6"} Mar 08 03:27:06.365118 master-0 kubenswrapper[13046]: E0308 03:27:06.365083 13046 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial 
tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 08 03:27:06.365163 master-0 kubenswrapper[13046]: I0308 03:27:06.365116 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:06.425698 master-0 kubenswrapper[13046]: I0308 03:27:06.425611 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 08 03:27:06.426769 master-0 kubenswrapper[13046]: I0308 03:27:06.426708 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:27:06.428340 master-0 kubenswrapper[13046]: I0308 03:27:06.428253 13046 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:06.429224 master-0 kubenswrapper[13046]: I0308 03:27:06.429159 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:06.620504 master-0 kubenswrapper[13046]: I0308 03:27:06.620346 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 08 03:27:06.620788 master-0 kubenswrapper[13046]: I0308 03:27:06.620571 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:27:06.620788 master-0 kubenswrapper[13046]: I0308 03:27:06.620641 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 08 03:27:06.620788 master-0 kubenswrapper[13046]: I0308 03:27:06.620733 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:27:06.621234 master-0 kubenswrapper[13046]: I0308 03:27:06.621154 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 08 03:27:06.621455 master-0 kubenswrapper[13046]: I0308 03:27:06.621197 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:27:06.622238 master-0 kubenswrapper[13046]: I0308 03:27:06.622180 13046 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:06.622348 master-0 kubenswrapper[13046]: I0308 03:27:06.622219 13046 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:06.622348 master-0 kubenswrapper[13046]: I0308 03:27:06.622272 13046 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:27:07.378522 master-0 kubenswrapper[13046]: I0308 03:27:07.378442 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 08 03:27:07.379776 master-0 kubenswrapper[13046]: I0308 03:27:07.379693 13046 
generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59" exitCode=0 Mar 08 03:27:07.381116 master-0 kubenswrapper[13046]: I0308 03:27:07.380962 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:27:07.381743 master-0 kubenswrapper[13046]: I0308 03:27:07.381665 13046 scope.go:117] "RemoveContainer" containerID="d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62" Mar 08 03:27:07.399226 master-0 kubenswrapper[13046]: I0308 03:27:07.399195 13046 scope.go:117] "RemoveContainer" containerID="c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b" Mar 08 03:27:07.406362 master-0 kubenswrapper[13046]: I0308 03:27:07.406294 13046 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:07.407233 master-0 kubenswrapper[13046]: I0308 03:27:07.407168 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:07.424116 master-0 kubenswrapper[13046]: I0308 03:27:07.424074 13046 scope.go:117] "RemoveContainer" containerID="b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b" Mar 08 03:27:07.444787 master-0 kubenswrapper[13046]: I0308 03:27:07.444737 13046 scope.go:117] "RemoveContainer" containerID="893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4" Mar 08 
03:27:07.462072 master-0 kubenswrapper[13046]: I0308 03:27:07.461990 13046 scope.go:117] "RemoveContainer" containerID="7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59" Mar 08 03:27:07.482165 master-0 kubenswrapper[13046]: I0308 03:27:07.482039 13046 scope.go:117] "RemoveContainer" containerID="ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9" Mar 08 03:27:07.505114 master-0 kubenswrapper[13046]: I0308 03:27:07.504875 13046 scope.go:117] "RemoveContainer" containerID="d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62" Mar 08 03:27:07.505568 master-0 kubenswrapper[13046]: E0308 03:27:07.505468 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62\": container with ID starting with d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62 not found: ID does not exist" containerID="d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62" Mar 08 03:27:07.505640 master-0 kubenswrapper[13046]: I0308 03:27:07.505567 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62"} err="failed to get container status \"d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62\": rpc error: code = NotFound desc = could not find container \"d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62\": container with ID starting with d58e407d63cda35ee6801e1e287ae05cc2871eb0da341a7c31ee6d7e31b71c62 not found: ID does not exist" Mar 08 03:27:07.505640 master-0 kubenswrapper[13046]: I0308 03:27:07.505589 13046 scope.go:117] "RemoveContainer" containerID="c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b" Mar 08 03:27:07.506043 master-0 kubenswrapper[13046]: E0308 03:27:07.505991 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b\": container with ID starting with c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b not found: ID does not exist" containerID="c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b" Mar 08 03:27:07.506043 master-0 kubenswrapper[13046]: I0308 03:27:07.506035 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b"} err="failed to get container status \"c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b\": rpc error: code = NotFound desc = could not find container \"c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b\": container with ID starting with c2cb83a2d211d613f22c3b1fd4eb371d6422ba8140e411f2a5ddfaebf6efed7b not found: ID does not exist" Mar 08 03:27:07.506043 master-0 kubenswrapper[13046]: I0308 03:27:07.506050 13046 scope.go:117] "RemoveContainer" containerID="b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b" Mar 08 03:27:07.506411 master-0 kubenswrapper[13046]: E0308 03:27:07.506377 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b\": container with ID starting with b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b not found: ID does not exist" containerID="b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b" Mar 08 03:27:07.506461 master-0 kubenswrapper[13046]: I0308 03:27:07.506421 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b"} err="failed to get container status \"b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b\": rpc error: code = NotFound desc = 
could not find container \"b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b\": container with ID starting with b936839b0dde4b12f7f5ec482102605813336abecccfa56b632f555d77961a4b not found: ID does not exist" Mar 08 03:27:07.506461 master-0 kubenswrapper[13046]: I0308 03:27:07.506438 13046 scope.go:117] "RemoveContainer" containerID="893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4" Mar 08 03:27:07.506703 master-0 kubenswrapper[13046]: E0308 03:27:07.506668 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4\": container with ID starting with 893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4 not found: ID does not exist" containerID="893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4" Mar 08 03:27:07.506758 master-0 kubenswrapper[13046]: I0308 03:27:07.506694 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4"} err="failed to get container status \"893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4\": rpc error: code = NotFound desc = could not find container \"893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4\": container with ID starting with 893d4c2f139e70dd4df4d544a39bda63b0dc41cac3b22cd1acb4c35744eb66a4 not found: ID does not exist" Mar 08 03:27:07.506758 master-0 kubenswrapper[13046]: I0308 03:27:07.506737 13046 scope.go:117] "RemoveContainer" containerID="7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59" Mar 08 03:27:07.507071 master-0 kubenswrapper[13046]: E0308 03:27:07.507032 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59\": container with ID starting with 
7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59 not found: ID does not exist" containerID="7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59" Mar 08 03:27:07.507120 master-0 kubenswrapper[13046]: I0308 03:27:07.507066 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59"} err="failed to get container status \"7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59\": rpc error: code = NotFound desc = could not find container \"7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59\": container with ID starting with 7cfcc2dcfd8cd7fa97cf1b2babfae8dc6754b0fcfc038f61c26342af616fab59 not found: ID does not exist" Mar 08 03:27:07.507120 master-0 kubenswrapper[13046]: I0308 03:27:07.507093 13046 scope.go:117] "RemoveContainer" containerID="ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9" Mar 08 03:27:07.507644 master-0 kubenswrapper[13046]: E0308 03:27:07.507578 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9\": container with ID starting with ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9 not found: ID does not exist" containerID="ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9" Mar 08 03:27:07.507644 master-0 kubenswrapper[13046]: I0308 03:27:07.507635 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9"} err="failed to get container status \"ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9\": rpc error: code = NotFound desc = could not find container \"ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9\": container with ID starting with 
ac758fcd667da08dc61aec7cbf9dde7a006ee8984ef30ab25b9c5df3b76c0db9 not found: ID does not exist" Mar 08 03:27:08.125842 master-0 kubenswrapper[13046]: I0308 03:27:08.125771 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:08.127649 master-0 kubenswrapper[13046]: I0308 03:27:08.127559 13046 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:08.134635 master-0 kubenswrapper[13046]: I0308 03:27:08.134596 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077dd10388b9e3e48a07382126e86621" path="/var/lib/kubelet/pods/077dd10388b9e3e48a07382126e86621/volumes" Mar 08 03:27:09.124567 master-0 kubenswrapper[13046]: E0308 03:27:09.124344 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:09.126083 master-0 kubenswrapper[13046]: E0308 03:27:09.125841 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 08 03:27:09.126582 master-0 kubenswrapper[13046]: E0308 03:27:09.126539 13046 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:09.127432 master-0 kubenswrapper[13046]: E0308 03:27:09.127381 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:09.128277 master-0 kubenswrapper[13046]: E0308 03:27:09.128240 13046 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:09.128368 master-0 kubenswrapper[13046]: I0308 03:27:09.128280 13046 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 08 03:27:09.129360 master-0 kubenswrapper[13046]: E0308 03:27:09.129252 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 08 03:27:09.331166 master-0 kubenswrapper[13046]: E0308 03:27:09.331008 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 08 03:27:09.407937 master-0 kubenswrapper[13046]: I0308 03:27:09.407814 13046 generic.go:334] "Generic (PLEG): container finished" podID="8517691b-937c-4cde-a7d2-fe18d6b7193d" containerID="3899d101e408c7a23497b90981b3128c53aac06bc31d8fc4ebd6216130e8cdcd" exitCode=0
Mar 08 03:27:09.407937 master-0 kubenswrapper[13046]: I0308 03:27:09.407877 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"8517691b-937c-4cde-a7d2-fe18d6b7193d","Type":"ContainerDied","Data":"3899d101e408c7a23497b90981b3128c53aac06bc31d8fc4ebd6216130e8cdcd"}
Mar 08 03:27:09.408658 master-0 kubenswrapper[13046]: I0308 03:27:09.408614 13046 status_manager.go:851] "Failed to get status for pod" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:09.409101 master-0 kubenswrapper[13046]: I0308 03:27:09.409055 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:09.731932 master-0 kubenswrapper[13046]: E0308 03:27:09.731852 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 08 03:27:10.534049 master-0 kubenswrapper[13046]: E0308 03:27:10.533982 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 08 03:27:10.872585 master-0 kubenswrapper[13046]: I0308 03:27:10.872527 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:27:10.873762 master-0 kubenswrapper[13046]: I0308 03:27:10.873711 13046 status_manager.go:851] "Failed to get status for pod" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:10.874876 master-0 kubenswrapper[13046]: I0308 03:27:10.874832 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:10.993525 master-0 kubenswrapper[13046]: I0308 03:27:10.993419 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-var-lock\") pod \"8517691b-937c-4cde-a7d2-fe18d6b7193d\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") "
Mar 08 03:27:10.993770 master-0 kubenswrapper[13046]: I0308 03:27:10.993647 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8517691b-937c-4cde-a7d2-fe18d6b7193d-kube-api-access\") pod \"8517691b-937c-4cde-a7d2-fe18d6b7193d\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") "
Mar 08 03:27:10.993770 master-0 kubenswrapper[13046]: I0308 03:27:10.993689 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-kubelet-dir\") pod \"8517691b-937c-4cde-a7d2-fe18d6b7193d\" (UID: \"8517691b-937c-4cde-a7d2-fe18d6b7193d\") "
Mar 08 03:27:10.994065 master-0 kubenswrapper[13046]: I0308 03:27:10.994030 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-var-lock" (OuterVolumeSpecName: "var-lock") pod "8517691b-937c-4cde-a7d2-fe18d6b7193d" (UID: "8517691b-937c-4cde-a7d2-fe18d6b7193d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:27:10.994269 master-0 kubenswrapper[13046]: I0308 03:27:10.994219 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8517691b-937c-4cde-a7d2-fe18d6b7193d" (UID: "8517691b-937c-4cde-a7d2-fe18d6b7193d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:27:10.994481 master-0 kubenswrapper[13046]: I0308 03:27:10.994452 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:10.998650 master-0 kubenswrapper[13046]: I0308 03:27:10.998575 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8517691b-937c-4cde-a7d2-fe18d6b7193d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8517691b-937c-4cde-a7d2-fe18d6b7193d" (UID: "8517691b-937c-4cde-a7d2-fe18d6b7193d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:27:11.096678 master-0 kubenswrapper[13046]: I0308 03:27:11.096548 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8517691b-937c-4cde-a7d2-fe18d6b7193d-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:11.096678 master-0 kubenswrapper[13046]: I0308 03:27:11.096648 13046 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8517691b-937c-4cde-a7d2-fe18d6b7193d-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:11.431840 master-0 kubenswrapper[13046]: I0308 03:27:11.431674 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"8517691b-937c-4cde-a7d2-fe18d6b7193d","Type":"ContainerDied","Data":"37fb8f6f5abcb97cff6bcd1ccf5844d7dd93bae9a05962636f1087066edf6fed"}
Mar 08 03:27:11.431840 master-0 kubenswrapper[13046]: I0308 03:27:11.431752 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 08 03:27:11.432140 master-0 kubenswrapper[13046]: I0308 03:27:11.431766 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37fb8f6f5abcb97cff6bcd1ccf5844d7dd93bae9a05962636f1087066edf6fed"
Mar 08 03:27:11.473560 master-0 kubenswrapper[13046]: I0308 03:27:11.473436 13046 status_manager.go:851] "Failed to get status for pod" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:11.474804 master-0 kubenswrapper[13046]: I0308 03:27:11.474680 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:11.664774 master-0 kubenswrapper[13046]: E0308 03:27:11.664432 13046 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{alertmanager-main-0.189abfec282b9b47 openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:alertmanager-main-0,UID:eb45ce29-bddc-46ac-a337-0ca37b46714e,APIVersion:v1,ResourceVersion:16085,FieldPath:spec.containers{alertmanager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e43499c79a8b5d642b3376af9595daaf45f91b3f616c93b24155f0d47003963\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-08 03:27:04.330844999 +0000 UTC m=+826.409612226,LastTimestamp:2026-03-08 03:27:04.330844999 +0000 UTC m=+826.409612226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 08 03:27:12.135685 master-0 kubenswrapper[13046]: E0308 03:27:12.135569 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 08 03:27:13.420896 master-0 kubenswrapper[13046]: I0308 03:27:13.420817 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 03:27:13.423268 master-0 kubenswrapper[13046]: I0308 03:27:13.420909 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 03:27:13.514280 master-0 kubenswrapper[13046]: I0308 03:27:13.514203 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body=
Mar 08 03:27:13.514644 master-0 kubenswrapper[13046]: I0308 03:27:13.514285 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused"
Mar 08 03:27:15.117806 master-0 kubenswrapper[13046]: I0308 03:27:15.117750 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:15.120026 master-0 kubenswrapper[13046]: I0308 03:27:15.119956 13046 status_manager.go:851] "Failed to get status for pod" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:15.121691 master-0 kubenswrapper[13046]: I0308 03:27:15.121605 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:15.154841 master-0 kubenswrapper[13046]: I0308 03:27:15.154783 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:15.154841 master-0 kubenswrapper[13046]: I0308 03:27:15.154823 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:15.155873 master-0 kubenswrapper[13046]: E0308 03:27:15.155803 13046 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:15.156685 master-0 kubenswrapper[13046]: I0308 03:27:15.156636 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:15.337717 master-0 kubenswrapper[13046]: E0308 03:27:15.337634 13046 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 08 03:27:15.470383 master-0 kubenswrapper[13046]: I0308 03:27:15.470306 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"6a4cb13c67587efda812f521861b593570e4f01e554185d0d76b68fdba8e5c6d"}
Mar 08 03:27:15.937082 master-0 kubenswrapper[13046]: E0308 03:27:15.936982 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:15Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:15Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:15Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T03:27:15Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:15.938376 master-0 kubenswrapper[13046]: E0308 03:27:15.938309 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:15.939286 master-0 kubenswrapper[13046]: E0308 03:27:15.939242 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:15.939994 master-0 kubenswrapper[13046]: E0308 03:27:15.939951 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:15.941957 master-0 kubenswrapper[13046]: E0308 03:27:15.941915 13046 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:15.941957 master-0 kubenswrapper[13046]: E0308 03:27:15.941949 13046 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 08 03:27:16.482782 master-0 kubenswrapper[13046]: I0308 03:27:16.482694 13046 generic.go:334] "Generic (PLEG): container finished" podID="36d4251d3504cdc0ec85144c1379056c" containerID="683836e41f92176912619eb6417e5ca89ed583121b54209e191375b1cb98561e" exitCode=0
Mar 08 03:27:16.482782 master-0 kubenswrapper[13046]: I0308 03:27:16.482779 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerDied","Data":"683836e41f92176912619eb6417e5ca89ed583121b54209e191375b1cb98561e"}
Mar 08 03:27:16.483788 master-0 kubenswrapper[13046]: I0308 03:27:16.483665 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:16.483788 master-0 kubenswrapper[13046]: I0308 03:27:16.483751 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:16.484601 master-0 kubenswrapper[13046]: I0308 03:27:16.484547 13046 status_manager.go:851] "Failed to get status for pod" podUID="eb45ce29-bddc-46ac-a337-0ca37b46714e" pod="openshift-monitoring/alertmanager-main-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/alertmanager-main-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:16.484884 master-0 kubenswrapper[13046]: E0308 03:27:16.484801 13046 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:16.486018 master-0 kubenswrapper[13046]: I0308 03:27:16.485734 13046 status_manager.go:851] "Failed to get status for pod" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 08 03:27:17.494114 master-0 kubenswrapper[13046]: I0308 03:27:17.494060 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_7c77ceff52dd4a01c709016eef561173/kube-controller-manager/0.log"
Mar 08 03:27:17.494475 master-0 kubenswrapper[13046]: I0308 03:27:17.494127 13046 generic.go:334] "Generic (PLEG): container finished" podID="7c77ceff52dd4a01c709016eef561173" containerID="444586abdf1eb02151f4f0d8b659d3442ea238eb9dcd40642fd48ac22b982d0e" exitCode=1
Mar 08 03:27:17.494475 master-0 kubenswrapper[13046]: I0308 03:27:17.494186 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7c77ceff52dd4a01c709016eef561173","Type":"ContainerDied","Data":"444586abdf1eb02151f4f0d8b659d3442ea238eb9dcd40642fd48ac22b982d0e"}
Mar 08 03:27:17.494794 master-0 kubenswrapper[13046]: I0308 03:27:17.494758 13046 scope.go:117] "RemoveContainer" containerID="444586abdf1eb02151f4f0d8b659d3442ea238eb9dcd40642fd48ac22b982d0e"
Mar 08 03:27:17.498780 master-0 kubenswrapper[13046]: I0308 03:27:17.498741 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"ce8f427b8bf34ec53df435c60b07523ab5967605a19bfad31e9b12766825b4b6"}
Mar 08 03:27:17.498780 master-0 kubenswrapper[13046]: I0308 03:27:17.498775 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"9f0d21fb91fc4fd0ea0964241a4c87139cbec4f45ceccefbc4631309fa6444b4"}
Mar 08 03:27:17.498780 master-0 kubenswrapper[13046]: I0308 03:27:17.498785 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"6a33281165537aeef328d69c6a24ccbd0c5ea48d1c2e24ffab9dd994662f052e"}
Mar 08 03:27:18.510154 master-0 kubenswrapper[13046]: I0308 03:27:18.510085 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"2ef103dc7a8b8350771a738faee46c1b0b6a4c5e282113f0ad2646f21b84ba5d"}
Mar 08 03:27:18.510154 master-0 kubenswrapper[13046]: I0308 03:27:18.510130 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"87f6b455974537e41e8286c21c6338dab8837fafb38bede878aaff183d9c01de"}
Mar 08 03:27:18.510680 master-0 kubenswrapper[13046]: I0308 03:27:18.510622 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:18.511118 master-0 kubenswrapper[13046]: I0308 03:27:18.511039 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:18.511499 master-0 kubenswrapper[13046]: I0308 03:27:18.511449 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:18.513347 master-0 kubenswrapper[13046]: I0308 03:27:18.513309 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_7c77ceff52dd4a01c709016eef561173/kube-controller-manager/0.log"
Mar 08 03:27:18.513413 master-0 kubenswrapper[13046]: I0308 03:27:18.513388 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7c77ceff52dd4a01c709016eef561173","Type":"ContainerStarted","Data":"52cb14c6bcb9cebab7d168696b3b6bcd0449b0f4c1910a69529277450917ec81"}
Mar 08 03:27:20.160726 master-0 kubenswrapper[13046]: I0308 03:27:20.157787 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:20.160726 master-0 kubenswrapper[13046]: I0308 03:27:20.158108 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:20.170894 master-0 kubenswrapper[13046]: I0308 03:27:20.170833 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:22.192347 master-0 kubenswrapper[13046]: I0308 03:27:22.192238 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:27:22.193400 master-0 kubenswrapper[13046]: I0308 03:27:22.192861 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:27:22.204964 master-0 kubenswrapper[13046]: I0308 03:27:22.204405 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:27:23.420601 master-0 kubenswrapper[13046]: I0308 03:27:23.420554 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 03:27:23.421044 master-0 kubenswrapper[13046]: I0308 03:27:23.420602 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 03:27:23.513827 master-0 kubenswrapper[13046]: I0308 03:27:23.513793 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body=
Mar 08 03:27:23.514077 master-0 kubenswrapper[13046]: I0308 03:27:23.514051 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused"
Mar 08 03:27:23.524059 master-0 kubenswrapper[13046]: I0308 03:27:23.524027 13046 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:23.551820 master-0 kubenswrapper[13046]: I0308 03:27:23.551785 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:23.552022 master-0 kubenswrapper[13046]: I0308 03:27:23.552010 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:23.555999 master-0 kubenswrapper[13046]: I0308 03:27:23.555983 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 08 03:27:23.558217 master-0 kubenswrapper[13046]: I0308 03:27:23.558164 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="80282202-3305-4e26-ad5c-deb34cad07da"
Mar 08 03:27:24.562702 master-0 kubenswrapper[13046]: I0308 03:27:24.562640 13046 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:24.562702 master-0 kubenswrapper[13046]: I0308 03:27:24.562687 13046 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="6813ddce-840d-4a92-991c-6fc6204282fc"
Mar 08 03:27:28.159073 master-0 kubenswrapper[13046]: I0308 03:27:28.158952 13046 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="80282202-3305-4e26-ad5c-deb34cad07da"
Mar 08 03:27:31.799112 master-0 kubenswrapper[13046]: I0308 03:27:31.799040 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 03:27:32.203557 master-0 kubenswrapper[13046]: I0308 03:27:32.203510 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 08 03:27:33.422741 master-0 kubenswrapper[13046]: I0308 03:27:33.421823 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body=
Mar 08 03:27:33.422741 master-0 kubenswrapper[13046]: I0308 03:27:33.421933 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused"
Mar 08 03:27:33.514533 master-0 kubenswrapper[13046]: I0308 03:27:33.513884 13046 patch_prober.go:28] interesting pod/console-668dfc897d-db2r2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused" start-of-body=
Mar 08 03:27:33.514533 master-0 kubenswrapper[13046]: I0308 03:27:33.513968 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" probeResult="failure" output="Get \"https://10.128.0.98:8443/health\": dial tcp 10.128.0.98:8443: connect: connection refused"
Mar 08 03:27:33.925723 master-0 kubenswrapper[13046]: I0308 03:27:33.925627 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 03:27:34.199935 master-0 kubenswrapper[13046]: I0308 03:27:34.199548 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 03:27:34.409014 master-0 kubenswrapper[13046]: I0308 03:27:34.408962 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-69f8r"
Mar 08 03:27:34.680317 master-0 kubenswrapper[13046]: I0308 03:27:34.680246 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-g6bk7"
Mar 08 03:27:34.710819 master-0 kubenswrapper[13046]: I0308 03:27:34.710736 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 08 03:27:34.823182 master-0 kubenswrapper[13046]: I0308 03:27:34.823065 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-lwkgm"
Mar 08 03:27:34.961159 master-0 kubenswrapper[13046]: I0308 03:27:34.960974 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 08 03:27:35.618856 master-0 kubenswrapper[13046]: I0308 03:27:35.618786 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 08 03:27:35.763166 master-0 kubenswrapper[13046]: I0308 03:27:35.763077 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 03:27:35.873523 master-0 kubenswrapper[13046]: I0308 03:27:35.873329 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 08 03:27:35.982018 master-0 kubenswrapper[13046]: I0308 03:27:35.981918 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08 03:27:36.111504 master-0 kubenswrapper[13046]: I0308 03:27:36.111403 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 08 03:27:36.155746 master-0 kubenswrapper[13046]: I0308 03:27:36.155604 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 08 03:27:36.282314 master-0 kubenswrapper[13046]: I0308 03:27:36.282244 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 08 03:27:36.283981 master-0 kubenswrapper[13046]: I0308 03:27:36.283931 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-aj3ujg6d4dtk9"
Mar 08 03:27:36.315892 master-0 kubenswrapper[13046]: I0308 03:27:36.315828 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 08 03:27:36.404400 master-0 kubenswrapper[13046]: I0308 03:27:36.404352 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 08 03:27:36.414257 master-0 kubenswrapper[13046]: I0308 03:27:36.414121 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 03:27:36.414888 master-0 kubenswrapper[13046]: I0308 03:27:36.414822 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 03:27:36.552791 master-0 kubenswrapper[13046]: I0308 03:27:36.552696 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 08 03:27:36.790098 master-0 kubenswrapper[13046]: I0308 03:27:36.789232 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 08 03:27:36.910968 master-0 kubenswrapper[13046]: I0308 03:27:36.910869 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 08 03:27:37.050841 master-0 kubenswrapper[13046]: I0308 03:27:37.050639 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 08 03:27:37.189839 master-0 kubenswrapper[13046]: I0308 03:27:37.189790 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-r4dpg"
Mar 08 03:27:37.225455 master-0 kubenswrapper[13046]: I0308 03:27:37.225404 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 08 03:27:37.226685 master-0 kubenswrapper[13046]: I0308 03:27:37.226653 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l8646"
Mar 08 03:27:37.386233 master-0 kubenswrapper[13046]: I0308 03:27:37.386167 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 03:27:37.491318 master-0 kubenswrapper[13046]: I0308 03:27:37.491272 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 03:27:37.536199 master-0 kubenswrapper[13046]: I0308 03:27:37.536086 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 03:27:37.735635 master-0 kubenswrapper[13046]: I0308 03:27:37.734082 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 03:27:37.749132 master-0 kubenswrapper[13046]: I0308 03:27:37.749068 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 08 03:27:37.763731 master-0 kubenswrapper[13046]: I0308 03:27:37.763676 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 08 03:27:37.772427 master-0 kubenswrapper[13046]: I0308 03:27:37.772354 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 03:27:37.778173 master-0 kubenswrapper[13046]: I0308 03:27:37.777888 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 08 03:27:38.030092 master-0 kubenswrapper[13046]: I0308 03:27:38.029979 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 03:27:38.090144 master-0 kubenswrapper[13046]: I0308 03:27:38.090083 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 08 03:27:38.133785 master-0 kubenswrapper[13046]: I0308 03:27:38.133722 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 08 03:27:38.201078 master-0 kubenswrapper[13046]: I0308 03:27:38.201012 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 08 03:27:38.207891 master-0 kubenswrapper[13046]: I0308 03:27:38.207833 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 08 03:27:38.303466 master-0 kubenswrapper[13046]: I0308 03:27:38.302663 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 08 03:27:38.313937 master-0 kubenswrapper[13046]: I0308 03:27:38.313863 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 03:27:38.346386 master-0 kubenswrapper[13046]: I0308 03:27:38.346315 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 03:27:38.396200 master-0 kubenswrapper[13046]: I0308 03:27:38.396021 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 03:27:38.407822 master-0 kubenswrapper[13046]: I0308 03:27:38.407788 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 08 03:27:38.453138 master-0 kubenswrapper[13046]: I0308 03:27:38.453079 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 03:27:38.513598 master-0 kubenswrapper[13046]: I0308 03:27:38.511408 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 03:27:38.623283 master-0 kubenswrapper[13046]: I0308 03:27:38.621991 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 08 03:27:38.630325 master-0 kubenswrapper[13046]: I0308 03:27:38.630271 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 08 03:27:38.706544 master-0 kubenswrapper[13046]: I0308 03:27:38.706425 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 03:27:38.715631 master-0 kubenswrapper[13046]: I0308 03:27:38.715585 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 08 03:27:38.732100 master-0 kubenswrapper[13046]: I0308 03:27:38.732042 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 08 03:27:38.738792 master-0 kubenswrapper[13046]: I0308 03:27:38.738737 13046
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 03:27:38.772741 master-0 kubenswrapper[13046]: I0308 03:27:38.772697 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-twhrj" Mar 08 03:27:38.815410 master-0 kubenswrapper[13046]: I0308 03:27:38.815321 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-4dw5m" Mar 08 03:27:38.872295 master-0 kubenswrapper[13046]: I0308 03:27:38.872203 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 08 03:27:38.902908 master-0 kubenswrapper[13046]: I0308 03:27:38.902742 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 08 03:27:38.943691 master-0 kubenswrapper[13046]: I0308 03:27:38.943609 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 03:27:38.987298 master-0 kubenswrapper[13046]: I0308 03:27:38.987236 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 08 03:27:39.074180 master-0 kubenswrapper[13046]: I0308 03:27:39.074028 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 03:27:39.113534 master-0 kubenswrapper[13046]: I0308 03:27:39.113436 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:27:39.296694 master-0 kubenswrapper[13046]: I0308 03:27:39.296587 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 03:27:39.347287 master-0 kubenswrapper[13046]: I0308 
03:27:39.347230 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 03:27:39.348383 master-0 kubenswrapper[13046]: I0308 03:27:39.348358 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 03:27:39.481145 master-0 kubenswrapper[13046]: I0308 03:27:39.481074 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 08 03:27:39.508246 master-0 kubenswrapper[13046]: I0308 03:27:39.508133 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 08 03:27:39.557627 master-0 kubenswrapper[13046]: I0308 03:27:39.557407 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 08 03:27:39.643300 master-0 kubenswrapper[13046]: I0308 03:27:39.643245 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-xfphx" Mar 08 03:27:39.649902 master-0 kubenswrapper[13046]: I0308 03:27:39.649839 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 03:27:39.692794 master-0 kubenswrapper[13046]: I0308 03:27:39.692664 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 03:27:39.694690 master-0 kubenswrapper[13046]: I0308 03:27:39.694638 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 03:27:39.750679 master-0 kubenswrapper[13046]: I0308 03:27:39.750607 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 
03:27:39.779683 master-0 kubenswrapper[13046]: I0308 03:27:39.779631 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 08 03:27:39.880336 master-0 kubenswrapper[13046]: I0308 03:27:39.880266 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 03:27:39.942570 master-0 kubenswrapper[13046]: I0308 03:27:39.942514 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 03:27:40.133076 master-0 kubenswrapper[13046]: I0308 03:27:40.132637 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 03:27:40.149701 master-0 kubenswrapper[13046]: I0308 03:27:40.149602 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 08 03:27:40.218833 master-0 kubenswrapper[13046]: I0308 03:27:40.218768 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 03:27:40.262958 master-0 kubenswrapper[13046]: I0308 03:27:40.262604 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 08 03:27:40.291879 master-0 kubenswrapper[13046]: I0308 03:27:40.291793 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 08 03:27:40.318384 master-0 kubenswrapper[13046]: I0308 03:27:40.318083 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 08 03:27:40.331997 master-0 kubenswrapper[13046]: I0308 03:27:40.331930 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 03:27:40.354763 master-0 kubenswrapper[13046]: I0308 03:27:40.354550 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-chsmd" Mar 08 03:27:40.428088 master-0 kubenswrapper[13046]: I0308 03:27:40.427969 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 08 03:27:40.449325 master-0 kubenswrapper[13046]: I0308 03:27:40.449267 13046 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 03:27:40.457214 master-0 kubenswrapper[13046]: I0308 03:27:40.457155 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-29dgn" Mar 08 03:27:40.496934 master-0 kubenswrapper[13046]: I0308 03:27:40.496831 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 03:27:40.575917 master-0 kubenswrapper[13046]: I0308 03:27:40.575834 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 03:27:40.595616 master-0 kubenswrapper[13046]: I0308 03:27:40.595530 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 03:27:40.607168 master-0 kubenswrapper[13046]: I0308 03:27:40.607087 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 03:27:40.652026 master-0 kubenswrapper[13046]: I0308 03:27:40.651954 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 08 03:27:40.660593 master-0 kubenswrapper[13046]: I0308 03:27:40.660545 13046 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 03:27:40.669438 master-0 kubenswrapper[13046]: I0308 03:27:40.669385 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 03:27:40.740280 master-0 kubenswrapper[13046]: I0308 03:27:40.740164 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 08 03:27:40.756656 master-0 kubenswrapper[13046]: I0308 03:27:40.754614 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 08 03:27:40.800838 master-0 kubenswrapper[13046]: I0308 03:27:40.800800 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 03:27:40.802315 master-0 kubenswrapper[13046]: I0308 03:27:40.802276 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 08 03:27:40.829372 master-0 kubenswrapper[13046]: I0308 03:27:40.829310 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 08 03:27:40.844438 master-0 kubenswrapper[13046]: I0308 03:27:40.844393 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 03:27:40.845297 master-0 kubenswrapper[13046]: I0308 03:27:40.845267 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 08 03:27:40.856280 master-0 kubenswrapper[13046]: I0308 03:27:40.856243 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 08 03:27:40.905496 master-0 kubenswrapper[13046]: I0308 03:27:40.905414 13046 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 03:27:40.934122 master-0 kubenswrapper[13046]: I0308 03:27:40.934063 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 03:27:40.935060 master-0 kubenswrapper[13046]: I0308 03:27:40.935019 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 08 03:27:40.960677 master-0 kubenswrapper[13046]: I0308 03:27:40.960623 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 03:27:40.972247 master-0 kubenswrapper[13046]: I0308 03:27:40.972205 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 03:27:40.992977 master-0 kubenswrapper[13046]: I0308 03:27:40.992885 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:27:41.072356 master-0 kubenswrapper[13046]: I0308 03:27:41.072300 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 03:27:41.119982 master-0 kubenswrapper[13046]: I0308 03:27:41.119940 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:27:41.155275 master-0 kubenswrapper[13046]: I0308 03:27:41.155228 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 08 03:27:41.204714 master-0 kubenswrapper[13046]: I0308 03:27:41.204669 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 03:27:41.233406 master-0 kubenswrapper[13046]: I0308 03:27:41.233356 13046 
reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 03:27:41.234229 master-0 kubenswrapper[13046]: I0308 03:27:41.234183 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=39.234167685 podStartE2EDuration="39.234167685s" podCreationTimestamp="2026-03-08 03:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:27:23.385870525 +0000 UTC m=+845.464637742" watchObservedRunningTime="2026-03-08 03:27:41.234167685 +0000 UTC m=+863.312934912" Mar 08 03:27:41.240550 master-0 kubenswrapper[13046]: I0308 03:27:41.240513 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:27:41.240610 master-0 kubenswrapper[13046]: I0308 03:27:41.240566 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 08 03:27:41.245356 master-0 kubenswrapper[13046]: I0308 03:27:41.245284 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 08 03:27:41.246745 master-0 kubenswrapper[13046]: I0308 03:27:41.246718 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:27:41.248543 master-0 kubenswrapper[13046]: I0308 03:27:41.248463 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 08 03:27:41.267360 master-0 kubenswrapper[13046]: I0308 03:27:41.267263 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=18.267239828 podStartE2EDuration="18.267239828s" podCreationTimestamp="2026-03-08 03:27:23 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:27:41.261753365 +0000 UTC m=+863.340520622" watchObservedRunningTime="2026-03-08 03:27:41.267239828 +0000 UTC m=+863.346007045" Mar 08 03:27:41.350153 master-0 kubenswrapper[13046]: I0308 03:27:41.350100 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 03:27:41.448708 master-0 kubenswrapper[13046]: I0308 03:27:41.448655 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 08 03:27:41.487192 master-0 kubenswrapper[13046]: I0308 03:27:41.487122 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 03:27:41.498623 master-0 kubenswrapper[13046]: I0308 03:27:41.498452 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 03:27:41.657757 master-0 kubenswrapper[13046]: I0308 03:27:41.657684 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 03:27:41.660836 master-0 kubenswrapper[13046]: I0308 03:27:41.660777 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 03:27:41.675598 master-0 kubenswrapper[13046]: I0308 03:27:41.675558 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 03:27:41.786370 master-0 kubenswrapper[13046]: I0308 03:27:41.786217 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 03:27:41.862438 master-0 kubenswrapper[13046]: I0308 03:27:41.862383 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 03:27:41.888472 master-0 kubenswrapper[13046]: I0308 03:27:41.888438 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 03:27:41.909016 master-0 kubenswrapper[13046]: I0308 03:27:41.908945 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 03:27:41.926825 master-0 kubenswrapper[13046]: I0308 03:27:41.926760 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-h99p2" Mar 08 03:27:41.928308 master-0 kubenswrapper[13046]: I0308 03:27:41.928268 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 03:27:41.958803 master-0 kubenswrapper[13046]: I0308 03:27:41.958750 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 03:27:42.005607 master-0 kubenswrapper[13046]: I0308 03:27:42.005550 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 03:27:42.014046 master-0 kubenswrapper[13046]: I0308 03:27:42.014006 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 08 03:27:42.015957 master-0 kubenswrapper[13046]: I0308 03:27:42.015935 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 03:27:42.031303 master-0 kubenswrapper[13046]: I0308 03:27:42.031253 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 03:27:42.032135 master-0 kubenswrapper[13046]: I0308 03:27:42.032092 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"kube-state-metrics-dockercfg-9xbqs" Mar 08 03:27:42.186122 master-0 kubenswrapper[13046]: I0308 03:27:42.186070 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 03:27:42.199265 master-0 kubenswrapper[13046]: I0308 03:27:42.199212 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 08 03:27:42.215544 master-0 kubenswrapper[13046]: I0308 03:27:42.215506 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 08 03:27:42.311617 master-0 kubenswrapper[13046]: I0308 03:27:42.311565 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 03:27:42.336718 master-0 kubenswrapper[13046]: I0308 03:27:42.336648 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 03:27:42.383673 master-0 kubenswrapper[13046]: I0308 03:27:42.383600 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 08 03:27:42.420460 master-0 kubenswrapper[13046]: I0308 03:27:42.420341 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 03:27:42.424178 master-0 kubenswrapper[13046]: I0308 03:27:42.424132 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-7kngk" Mar 08 03:27:42.475927 master-0 kubenswrapper[13046]: I0308 03:27:42.475803 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-c72nd" Mar 08 03:27:42.545795 master-0 kubenswrapper[13046]: I0308 03:27:42.545747 13046 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 08 03:27:42.577065 master-0 kubenswrapper[13046]: I0308 03:27:42.577020 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 03:27:42.598595 master-0 kubenswrapper[13046]: I0308 03:27:42.598526 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 03:27:42.625595 master-0 kubenswrapper[13046]: I0308 03:27:42.625540 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 03:27:42.704409 master-0 kubenswrapper[13046]: I0308 03:27:42.704356 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 08 03:27:42.742543 master-0 kubenswrapper[13046]: I0308 03:27:42.741740 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 08 03:27:42.756541 master-0 kubenswrapper[13046]: I0308 03:27:42.753973 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 03:27:42.890975 master-0 kubenswrapper[13046]: I0308 03:27:42.890734 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-668dfc897d-db2r2"] Mar 08 03:27:42.966501 master-0 kubenswrapper[13046]: I0308 03:27:42.964957 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 08 03:27:42.966501 master-0 kubenswrapper[13046]: I0308 03:27:42.965852 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-wxpwp" Mar 08 03:27:43.009300 master-0 kubenswrapper[13046]: I0308 03:27:43.009197 13046 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 03:27:43.012407 master-0 kubenswrapper[13046]: I0308 03:27:43.012378 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-rkgwg" Mar 08 03:27:43.058903 master-0 kubenswrapper[13046]: I0308 03:27:43.058854 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 03:27:43.065020 master-0 kubenswrapper[13046]: I0308 03:27:43.064971 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 08 03:27:43.065276 master-0 kubenswrapper[13046]: I0308 03:27:43.065234 13046 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 03:27:43.133519 master-0 kubenswrapper[13046]: I0308 03:27:43.133459 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 03:27:43.243397 master-0 kubenswrapper[13046]: I0308 03:27:43.243346 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 03:27:43.246931 master-0 kubenswrapper[13046]: I0308 03:27:43.246898 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 03:27:43.350928 master-0 kubenswrapper[13046]: I0308 03:27:43.350875 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 08 03:27:43.411776 master-0 kubenswrapper[13046]: I0308 03:27:43.411723 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 03:27:43.414145 master-0 kubenswrapper[13046]: I0308 03:27:43.414109 13046 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 03:27:43.424335 master-0 kubenswrapper[13046]: I0308 03:27:43.424293 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 03:27:43.425206 master-0 kubenswrapper[13046]: I0308 03:27:43.425160 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7977cd7c97-8tssk" Mar 08 03:27:43.429244 master-0 kubenswrapper[13046]: I0308 03:27:43.429212 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7977cd7c97-8tssk" Mar 08 03:27:43.464257 master-0 kubenswrapper[13046]: I0308 03:27:43.464198 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 03:27:43.594507 master-0 kubenswrapper[13046]: I0308 03:27:43.594443 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 08 03:27:43.599849 master-0 kubenswrapper[13046]: I0308 03:27:43.599793 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-7ndsq" Mar 08 03:27:43.640697 master-0 kubenswrapper[13046]: I0308 03:27:43.640571 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-rdhz7" Mar 08 03:27:43.652842 master-0 kubenswrapper[13046]: I0308 03:27:43.652778 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 03:27:43.680456 master-0 kubenswrapper[13046]: I0308 03:27:43.680404 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 08 03:27:43.705275 master-0 kubenswrapper[13046]: I0308 03:27:43.705166 13046 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 03:27:43.716514 master-0 kubenswrapper[13046]: I0308 03:27:43.716434 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 03:27:43.963741 master-0 kubenswrapper[13046]: I0308 03:27:43.963606 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 08 03:27:43.975204 master-0 kubenswrapper[13046]: I0308 03:27:43.975166 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 08 03:27:44.026754 master-0 kubenswrapper[13046]: I0308 03:27:44.026695 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 08 03:27:44.042893 master-0 kubenswrapper[13046]: I0308 03:27:44.042814 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-svw57" Mar 08 03:27:44.084988 master-0 kubenswrapper[13046]: I0308 03:27:44.084924 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 08 03:27:44.087344 master-0 kubenswrapper[13046]: I0308 03:27:44.087053 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 03:27:44.164229 master-0 kubenswrapper[13046]: I0308 03:27:44.164066 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 03:27:44.168079 master-0 kubenswrapper[13046]: I0308 03:27:44.168010 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 08 03:27:44.187973 master-0 kubenswrapper[13046]: I0308 
03:27:44.187923 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 08 03:27:44.197216 master-0 kubenswrapper[13046]: I0308 03:27:44.197136 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 03:27:44.209885 master-0 kubenswrapper[13046]: I0308 03:27:44.209795 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 03:27:44.227758 master-0 kubenswrapper[13046]: I0308 03:27:44.227610 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 03:27:44.261172 master-0 kubenswrapper[13046]: I0308 03:27:44.261073 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 08 03:27:44.292313 master-0 kubenswrapper[13046]: I0308 03:27:44.292250 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 03:27:44.325176 master-0 kubenswrapper[13046]: I0308 03:27:44.325114 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 03:27:44.434584 master-0 kubenswrapper[13046]: I0308 03:27:44.434477 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 03:27:44.456914 master-0 kubenswrapper[13046]: I0308 03:27:44.456864 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 03:27:44.461901 master-0 kubenswrapper[13046]: I0308 03:27:44.461822 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 03:27:44.546063 master-0 kubenswrapper[13046]: I0308 03:27:44.545881 13046 
reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 03:27:44.579152 master-0 kubenswrapper[13046]: I0308 03:27:44.579052 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-5d9sz" Mar 08 03:27:44.613083 master-0 kubenswrapper[13046]: I0308 03:27:44.612959 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 03:27:44.614609 master-0 kubenswrapper[13046]: I0308 03:27:44.614478 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 08 03:27:44.617724 master-0 kubenswrapper[13046]: I0308 03:27:44.617664 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 03:27:44.621596 master-0 kubenswrapper[13046]: I0308 03:27:44.621550 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 08 03:27:44.628438 master-0 kubenswrapper[13046]: I0308 03:27:44.628385 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-dmv4m" Mar 08 03:27:44.640401 master-0 kubenswrapper[13046]: I0308 03:27:44.640354 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 03:27:44.669094 master-0 kubenswrapper[13046]: I0308 03:27:44.669022 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 03:27:44.721559 master-0 kubenswrapper[13046]: I0308 03:27:44.721518 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 03:27:44.768384 master-0 kubenswrapper[13046]: I0308 
03:27:44.768349 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 03:27:44.797023 master-0 kubenswrapper[13046]: I0308 03:27:44.796904 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 03:27:44.868996 master-0 kubenswrapper[13046]: I0308 03:27:44.868910 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 08 03:27:44.890833 master-0 kubenswrapper[13046]: I0308 03:27:44.890741 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 03:27:44.941669 master-0 kubenswrapper[13046]: I0308 03:27:44.939708 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 03:27:44.946957 master-0 kubenswrapper[13046]: I0308 03:27:44.946903 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 08 03:27:44.973333 master-0 kubenswrapper[13046]: I0308 03:27:44.973271 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 03:27:44.983224 master-0 kubenswrapper[13046]: I0308 03:27:44.983173 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-hlcg2" Mar 08 03:27:45.038882 master-0 kubenswrapper[13046]: I0308 03:27:45.038817 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 03:27:45.050548 master-0 kubenswrapper[13046]: I0308 03:27:45.050447 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 03:27:45.069511 master-0 kubenswrapper[13046]: I0308 03:27:45.069446 13046 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-tvw7c" Mar 08 03:27:45.095710 master-0 kubenswrapper[13046]: I0308 03:27:45.095640 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 03:27:45.261719 master-0 kubenswrapper[13046]: I0308 03:27:45.261659 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 08 03:27:45.277269 master-0 kubenswrapper[13046]: I0308 03:27:45.277219 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-xfc6f" Mar 08 03:27:45.441629 master-0 kubenswrapper[13046]: I0308 03:27:45.441569 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 03:27:45.469322 master-0 kubenswrapper[13046]: I0308 03:27:45.469267 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 03:27:45.471657 master-0 kubenswrapper[13046]: I0308 03:27:45.471604 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-9f7bj" Mar 08 03:27:45.486169 master-0 kubenswrapper[13046]: I0308 03:27:45.486113 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 03:27:45.545827 master-0 kubenswrapper[13046]: I0308 03:27:45.545757 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 08 03:27:45.561474 master-0 kubenswrapper[13046]: I0308 03:27:45.561422 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 03:27:45.565621 master-0 kubenswrapper[13046]: I0308 03:27:45.565569 13046 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 03:27:45.585853 master-0 kubenswrapper[13046]: I0308 03:27:45.585783 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 08 03:27:45.602072 master-0 kubenswrapper[13046]: I0308 03:27:45.602012 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 03:27:45.632409 master-0 kubenswrapper[13046]: I0308 03:27:45.632351 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-5lx9s" Mar 08 03:27:45.680719 master-0 kubenswrapper[13046]: I0308 03:27:45.680654 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 08 03:27:45.717355 master-0 kubenswrapper[13046]: I0308 03:27:45.717209 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-atdhv16h3mv90" Mar 08 03:27:45.746467 master-0 kubenswrapper[13046]: I0308 03:27:45.746414 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 03:27:45.786753 master-0 kubenswrapper[13046]: I0308 03:27:45.786690 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 03:27:45.786753 master-0 kubenswrapper[13046]: I0308 03:27:45.786706 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 08 03:27:45.795634 master-0 kubenswrapper[13046]: I0308 03:27:45.795553 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-sbm8j" Mar 08 03:27:45.880024 master-0 kubenswrapper[13046]: I0308 03:27:45.879952 13046 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 03:27:45.918571 master-0 kubenswrapper[13046]: I0308 03:27:45.918478 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 03:27:45.933011 master-0 kubenswrapper[13046]: I0308 03:27:45.932938 13046 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 03:27:45.962925 master-0 kubenswrapper[13046]: I0308 03:27:45.962844 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 03:27:45.971886 master-0 kubenswrapper[13046]: I0308 03:27:45.971787 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 03:27:46.016966 master-0 kubenswrapper[13046]: I0308 03:27:46.016893 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-hrrx2" Mar 08 03:27:46.027251 master-0 kubenswrapper[13046]: I0308 03:27:46.027183 13046 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 08 03:27:46.027509 master-0 kubenswrapper[13046]: I0308 03:27:46.027436 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" containerID="cri-o://9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3" gracePeriod=5 Mar 08 03:27:46.099021 master-0 kubenswrapper[13046]: I0308 03:27:46.098960 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 03:27:46.195765 master-0 kubenswrapper[13046]: I0308 03:27:46.195696 13046 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-lr548" Mar 08 03:27:46.195765 master-0 kubenswrapper[13046]: I0308 03:27:46.195696 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 03:27:46.210136 master-0 kubenswrapper[13046]: I0308 03:27:46.210062 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 08 03:27:46.218993 master-0 kubenswrapper[13046]: I0308 03:27:46.218910 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 03:27:46.236739 master-0 kubenswrapper[13046]: I0308 03:27:46.236627 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 08 03:27:46.272718 master-0 kubenswrapper[13046]: I0308 03:27:46.272453 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 03:27:46.318141 master-0 kubenswrapper[13046]: I0308 03:27:46.318082 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 03:27:46.371321 master-0 kubenswrapper[13046]: I0308 03:27:46.371256 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 03:27:46.402085 master-0 kubenswrapper[13046]: I0308 03:27:46.402008 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 03:27:46.412779 master-0 kubenswrapper[13046]: I0308 03:27:46.412743 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 08 03:27:46.438342 master-0 kubenswrapper[13046]: I0308 
03:27:46.438285 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 08 03:27:46.478886 master-0 kubenswrapper[13046]: I0308 03:27:46.478801 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 08 03:27:46.513346 master-0 kubenswrapper[13046]: I0308 03:27:46.513216 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 03:27:46.551645 master-0 kubenswrapper[13046]: I0308 03:27:46.551590 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 03:27:46.693694 master-0 kubenswrapper[13046]: I0308 03:27:46.693628 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 03:27:46.756458 master-0 kubenswrapper[13046]: I0308 03:27:46.756381 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 03:27:46.769525 master-0 kubenswrapper[13046]: I0308 03:27:46.769370 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 03:27:46.790336 master-0 kubenswrapper[13046]: I0308 03:27:46.790282 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 08 03:27:46.836697 master-0 kubenswrapper[13046]: I0308 03:27:46.836614 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 03:27:46.909408 master-0 kubenswrapper[13046]: I0308 03:27:46.908835 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 08 
03:27:46.956154 master-0 kubenswrapper[13046]: I0308 03:27:46.956076 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 08 03:27:47.062872 master-0 kubenswrapper[13046]: I0308 03:27:47.062724 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 03:27:47.063416 master-0 kubenswrapper[13046]: I0308 03:27:47.063376 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 08 03:27:47.064750 master-0 kubenswrapper[13046]: I0308 03:27:47.064703 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 08 03:27:47.106430 master-0 kubenswrapper[13046]: I0308 03:27:47.106348 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 08 03:27:47.123639 master-0 kubenswrapper[13046]: I0308 03:27:47.123577 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 03:27:47.141728 master-0 kubenswrapper[13046]: I0308 03:27:47.141651 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 03:27:47.142005 master-0 kubenswrapper[13046]: I0308 03:27:47.141825 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-86bc7f4f4f-tzjk6"] Mar 08 03:27:47.142261 master-0 kubenswrapper[13046]: E0308 03:27:47.142222 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" containerName="installer" Mar 08 03:27:47.142261 master-0 kubenswrapper[13046]: I0308 03:27:47.142254 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" containerName="installer" Mar 08 03:27:47.142457 
master-0 kubenswrapper[13046]: E0308 03:27:47.142287 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 03:27:47.142457 master-0 kubenswrapper[13046]: I0308 03:27:47.142303 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 03:27:47.142641 master-0 kubenswrapper[13046]: I0308 03:27:47.142551 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 08 03:27:47.142641 master-0 kubenswrapper[13046]: I0308 03:27:47.142602 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="8517691b-937c-4cde-a7d2-fe18d6b7193d" containerName="installer" Mar 08 03:27:47.143372 master-0 kubenswrapper[13046]: I0308 03:27:47.143323 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.197019 master-0 kubenswrapper[13046]: I0308 03:27:47.196950 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-oauth-serving-cert\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.197324 master-0 kubenswrapper[13046]: I0308 03:27:47.197268 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-service-ca\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.197465 master-0 kubenswrapper[13046]: I0308 03:27:47.197372 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-config\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.197465 master-0 kubenswrapper[13046]: I0308 03:27:47.197459 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-oauth-config\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.197738 master-0 kubenswrapper[13046]: I0308 03:27:47.197541 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-serving-cert\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.197738 master-0 kubenswrapper[13046]: I0308 03:27:47.197593 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6sdt\" (UniqueName: \"kubernetes.io/projected/d9daabd8-0156-4337-b6d5-3eb664bf8663-kube-api-access-x6sdt\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.197956 master-0 kubenswrapper[13046]: I0308 03:27:47.197757 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-trusted-ca-bundle\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " 
pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.224624 master-0 kubenswrapper[13046]: I0308 03:27:47.224524 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 03:27:47.270438 master-0 kubenswrapper[13046]: I0308 03:27:47.270380 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-zr868" Mar 08 03:27:47.273218 master-0 kubenswrapper[13046]: I0308 03:27:47.273187 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-kfwcc" Mar 08 03:27:47.298994 master-0 kubenswrapper[13046]: I0308 03:27:47.298904 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-trusted-ca-bundle\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.300451 master-0 kubenswrapper[13046]: I0308 03:27:47.300409 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-oauth-serving-cert\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.300756 master-0 kubenswrapper[13046]: I0308 03:27:47.300463 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-service-ca\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.300756 master-0 kubenswrapper[13046]: I0308 03:27:47.300522 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-config\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.300756 master-0 kubenswrapper[13046]: I0308 03:27:47.300575 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-oauth-config\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.300756 master-0 kubenswrapper[13046]: I0308 03:27:47.300613 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-serving-cert\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.300756 master-0 kubenswrapper[13046]: I0308 03:27:47.300641 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6sdt\" (UniqueName: \"kubernetes.io/projected/d9daabd8-0156-4337-b6d5-3eb664bf8663-kube-api-access-x6sdt\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.301150 master-0 kubenswrapper[13046]: I0308 03:27:47.300226 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-trusted-ca-bundle\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.301889 master-0 kubenswrapper[13046]: I0308 
03:27:47.301849 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-oauth-serving-cert\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.302705 master-0 kubenswrapper[13046]: I0308 03:27:47.302664 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-service-ca\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.305383 master-0 kubenswrapper[13046]: I0308 03:27:47.305005 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-config\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.305383 master-0 kubenswrapper[13046]: I0308 03:27:47.305121 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 03:27:47.305684 master-0 kubenswrapper[13046]: I0308 03:27:47.305419 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 08 03:27:47.309870 master-0 kubenswrapper[13046]: I0308 03:27:47.308103 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-serving-cert\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.309870 master-0 kubenswrapper[13046]: I0308 
03:27:47.308536 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-oauth-config\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.335372 master-0 kubenswrapper[13046]: I0308 03:27:47.335214 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6sdt\" (UniqueName: \"kubernetes.io/projected/d9daabd8-0156-4337-b6d5-3eb664bf8663-kube-api-access-x6sdt\") pod \"console-86bc7f4f4f-tzjk6\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.383733 master-0 kubenswrapper[13046]: I0308 03:27:47.383661 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-m2ccx" Mar 08 03:27:47.432285 master-0 kubenswrapper[13046]: I0308 03:27:47.432236 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 03:27:47.482590 master-0 kubenswrapper[13046]: I0308 03:27:47.482452 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:27:47.486733 master-0 kubenswrapper[13046]: I0308 03:27:47.486663 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 08 03:27:47.515759 master-0 kubenswrapper[13046]: I0308 03:27:47.515657 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 03:27:47.569610 master-0 kubenswrapper[13046]: I0308 03:27:47.569536 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 03:27:47.625378 master-0 kubenswrapper[13046]: I0308 03:27:47.625311 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-qktgm" Mar 08 03:27:47.689444 master-0 kubenswrapper[13046]: I0308 03:27:47.688399 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 08 03:27:47.715783 master-0 kubenswrapper[13046]: I0308 03:27:47.715726 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 03:27:47.803810 master-0 kubenswrapper[13046]: I0308 03:27:47.803727 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 03:27:47.836707 master-0 kubenswrapper[13046]: I0308 03:27:47.836635 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-1allgkpdij0ou" Mar 08 03:27:47.847136 master-0 kubenswrapper[13046]: I0308 03:27:47.842191 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 03:27:47.924698 master-0 kubenswrapper[13046]: I0308 03:27:47.924556 13046 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 08 03:27:48.032447 master-0 kubenswrapper[13046]: I0308 03:27:48.032367 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 08 03:27:48.052735 master-0 kubenswrapper[13046]: I0308 03:27:48.052702 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-lf85f"
Mar 08 03:27:48.140263 master-0 kubenswrapper[13046]: I0308 03:27:48.140166 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 08 03:27:48.186283 master-0 kubenswrapper[13046]: I0308 03:27:48.186113 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 03:27:48.275725 master-0 kubenswrapper[13046]: I0308 03:27:48.275641 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 03:27:48.341833 master-0 kubenswrapper[13046]: I0308 03:27:48.341754 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 08 03:27:48.356231 master-0 kubenswrapper[13046]: I0308 03:27:48.356151 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 08 03:27:48.435705 master-0 kubenswrapper[13046]: I0308 03:27:48.435613 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 08 03:27:48.452424 master-0 kubenswrapper[13046]: I0308 03:27:48.452273 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 08 03:27:48.456965 master-0 kubenswrapper[13046]: I0308 03:27:48.456906 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 03:27:48.461977 master-0 kubenswrapper[13046]: I0308 03:27:48.461916 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 08 03:27:48.464010 master-0 kubenswrapper[13046]: I0308 03:27:48.463953 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 08 03:27:48.543918 master-0 kubenswrapper[13046]: I0308 03:27:48.543855 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 03:27:48.596738 master-0 kubenswrapper[13046]: I0308 03:27:48.596662 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-qr9xw"
Mar 08 03:27:48.650457 master-0 kubenswrapper[13046]: I0308 03:27:48.650397 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 08 03:27:48.670811 master-0 kubenswrapper[13046]: I0308 03:27:48.670728 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-f5lxw"
Mar 08 03:27:48.682341 master-0 kubenswrapper[13046]: I0308 03:27:48.682259 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 03:27:48.711183 master-0 kubenswrapper[13046]: I0308 03:27:48.711090 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 08 03:27:48.738760 master-0 kubenswrapper[13046]: I0308 03:27:48.738681 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 03:27:48.749297 master-0 kubenswrapper[13046]: I0308 03:27:48.749260 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 08 03:27:48.908875 master-0 kubenswrapper[13046]: I0308 03:27:48.908786 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 08 03:27:48.989239 master-0 kubenswrapper[13046]: I0308 03:27:48.989113 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 08 03:27:49.069760 master-0 kubenswrapper[13046]: I0308 03:27:49.069688 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 03:27:49.095425 master-0 kubenswrapper[13046]: I0308 03:27:49.095354 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 03:27:49.224887 master-0 kubenswrapper[13046]: I0308 03:27:49.224834 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 08 03:27:49.304693 master-0 kubenswrapper[13046]: I0308 03:27:49.304540 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-7c8qc"
Mar 08 03:27:49.546058 master-0 kubenswrapper[13046]: I0308 03:27:49.545983 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 08 03:27:49.572692 master-0 kubenswrapper[13046]: I0308 03:27:49.572568 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 03:27:49.579014 master-0 kubenswrapper[13046]: I0308 03:27:49.576242 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bc7f4f4f-tzjk6"]
Mar 08 03:27:49.661073 master-0 kubenswrapper[13046]: I0308 03:27:49.661022 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 08 03:27:49.714453 master-0 kubenswrapper[13046]: I0308 03:27:49.714394 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 08 03:27:49.811077 master-0 kubenswrapper[13046]: I0308 03:27:49.811029 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-tqmc4"
Mar 08 03:27:49.895782 master-0 kubenswrapper[13046]: I0308 03:27:49.894800 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 03:27:50.002652 master-0 kubenswrapper[13046]: I0308 03:27:50.002587 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86bc7f4f4f-tzjk6"]
Mar 08 03:27:50.057155 master-0 kubenswrapper[13046]: I0308 03:27:50.057092 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 08 03:27:50.090514 master-0 kubenswrapper[13046]: I0308 03:27:50.090456 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-r5m92"
Mar 08 03:27:50.347187 master-0 kubenswrapper[13046]: I0308 03:27:50.347125 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 03:27:50.408217 master-0 kubenswrapper[13046]: I0308 03:27:50.408160 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 08 03:27:50.623901 master-0 kubenswrapper[13046]: I0308 03:27:50.623855 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 03:27:50.638965 master-0 kubenswrapper[13046]: I0308 03:27:50.638945 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 08 03:27:50.713695 master-0 kubenswrapper[13046]: I0308 03:27:50.713628 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 08 03:27:50.821279 master-0 kubenswrapper[13046]: I0308 03:27:50.821221 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bc7f4f4f-tzjk6" event={"ID":"d9daabd8-0156-4337-b6d5-3eb664bf8663","Type":"ContainerStarted","Data":"37c1f48e626b236b64b19ec0a2dc05cfbb955ce39684dc039c90d4354c0a1689"}
Mar 08 03:27:50.821540 master-0 kubenswrapper[13046]: I0308 03:27:50.821287 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bc7f4f4f-tzjk6" event={"ID":"d9daabd8-0156-4337-b6d5-3eb664bf8663","Type":"ContainerStarted","Data":"c5dffead47a409534ba1ccb9f71586e10015d1d905f119eedcac49cde347e78d"}
Mar 08 03:27:50.848750 master-0 kubenswrapper[13046]: I0308 03:27:50.848693 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86bc7f4f4f-tzjk6" podStartSLOduration=8.848674423 podStartE2EDuration="8.848674423s" podCreationTimestamp="2026-03-08 03:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:27:50.847231114 +0000 UTC m=+872.925998371" watchObservedRunningTime="2026-03-08 03:27:50.848674423 +0000 UTC m=+872.927441640"
Mar 08 03:27:50.881131 master-0 kubenswrapper[13046]: I0308 03:27:50.881026 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 08 03:27:51.490911 master-0 kubenswrapper[13046]: I0308 03:27:51.490792 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 08 03:27:51.505665 master-0 kubenswrapper[13046]: I0308 03:27:51.505388 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 03:27:51.512907 master-0 kubenswrapper[13046]: I0308 03:27:51.512843 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 08 03:27:51.641594 master-0 kubenswrapper[13046]: I0308 03:27:51.641555 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log"
Mar 08 03:27:51.641813 master-0 kubenswrapper[13046]: I0308 03:27:51.641635 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:51.670941 master-0 kubenswrapper[13046]: I0308 03:27:51.670884 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 03:27:51.684112 master-0 kubenswrapper[13046]: I0308 03:27:51.684051 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 03:27:51.684244 master-0 kubenswrapper[13046]: I0308 03:27:51.684118 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 03:27:51.684320 master-0 kubenswrapper[13046]: I0308 03:27:51.684292 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 03:27:51.684418 master-0 kubenswrapper[13046]: I0308 03:27:51.684373 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log" (OuterVolumeSpecName: "var-log") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:27:51.684461 master-0 kubenswrapper[13046]: I0308 03:27:51.684391 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 03:27:51.684509 master-0 kubenswrapper[13046]: I0308 03:27:51.684465 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock" (OuterVolumeSpecName: "var-lock") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:27:51.684509 master-0 kubenswrapper[13046]: I0308 03:27:51.684472 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") "
Mar 08 03:27:51.684575 master-0 kubenswrapper[13046]: I0308 03:27:51.684528 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests" (OuterVolumeSpecName: "manifests") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:27:51.684658 master-0 kubenswrapper[13046]: I0308 03:27:51.684633 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:27:51.685172 master-0 kubenswrapper[13046]: I0308 03:27:51.685140 13046 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:51.685219 master-0 kubenswrapper[13046]: I0308 03:27:51.685196 13046 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:51.685219 master-0 kubenswrapper[13046]: I0308 03:27:51.685212 13046 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:51.685276 master-0 kubenswrapper[13046]: I0308 03:27:51.685226 13046 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:51.692630 master-0 kubenswrapper[13046]: I0308 03:27:51.692562 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 03:27:51.788332 master-0 kubenswrapper[13046]: I0308 03:27:51.788142 13046 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 08 03:27:51.844274 master-0 kubenswrapper[13046]: I0308 03:27:51.844216 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log"
Mar 08 03:27:51.844713 master-0 kubenswrapper[13046]: I0308 03:27:51.844316 13046 generic.go:334] "Generic (PLEG): container finished" podID="a814bd60de133d95cf99630a978c017e" containerID="9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3" exitCode=137
Mar 08 03:27:51.844713 master-0 kubenswrapper[13046]: I0308 03:27:51.844403 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 08 03:27:51.844713 master-0 kubenswrapper[13046]: I0308 03:27:51.844458 13046 scope.go:117] "RemoveContainer" containerID="9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3"
Mar 08 03:27:51.876862 master-0 kubenswrapper[13046]: I0308 03:27:51.876125 13046 scope.go:117] "RemoveContainer" containerID="9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3"
Mar 08 03:27:51.878104 master-0 kubenswrapper[13046]: E0308 03:27:51.878047 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3\": container with ID starting with 9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3 not found: ID does not exist" containerID="9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3"
Mar 08 03:27:51.878272 master-0 kubenswrapper[13046]: I0308 03:27:51.878106 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3"} err="failed to get container status \"9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3\": rpc error: code = NotFound desc = could not find container \"9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3\": container with ID starting with 9903349b774f9f7e7b55fdea42363d8b7543900611c9a67b76a69d9b7a94cbf3 not found: ID does not exist"
Mar 08 03:27:52.069574 master-0 kubenswrapper[13046]: I0308 03:27:52.069380 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 08 03:27:52.128832 master-0 kubenswrapper[13046]: I0308 03:27:52.128741 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a814bd60de133d95cf99630a978c017e" path="/var/lib/kubelet/pods/a814bd60de133d95cf99630a978c017e/volumes"
Mar 08 03:27:52.705666 master-0 kubenswrapper[13046]: I0308 03:27:52.705576 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 08 03:27:57.482836 master-0 kubenswrapper[13046]: I0308 03:27:57.482759 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-86bc7f4f4f-tzjk6"
Mar 08 03:27:57.482836 master-0 kubenswrapper[13046]: I0308 03:27:57.482812 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86bc7f4f4f-tzjk6"
Mar 08 03:27:57.488907 master-0 kubenswrapper[13046]: I0308 03:27:57.488829 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86bc7f4f4f-tzjk6"
Mar 08 03:27:57.901037 master-0 kubenswrapper[13046]: I0308 03:27:57.900887 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86bc7f4f4f-tzjk6"
Mar 08 03:27:58.043291 master-0 kubenswrapper[13046]: I0308 03:27:58.043035 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7977cd7c97-8tssk"]
Mar 08 03:28:07.937767 master-0 kubenswrapper[13046]: I0308 03:28:07.937652 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-668dfc897d-db2r2" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console" containerID="cri-o://b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725" gracePeriod=15
Mar 08 03:28:08.543012 master-0 kubenswrapper[13046]: I0308 03:28:08.542925 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-668dfc897d-db2r2_344be988-4e0e-46c9-9ba9-a84e76abe7bc/console/0.log"
Mar 08 03:28:08.543012 master-0 kubenswrapper[13046]: I0308 03:28:08.543009 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668dfc897d-db2r2"
Mar 08 03:28:08.596624 master-0 kubenswrapper[13046]: I0308 03:28:08.596285 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98wdn\" (UniqueName: \"kubernetes.io/projected/344be988-4e0e-46c9-9ba9-a84e76abe7bc-kube-api-access-98wdn\") pod \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") "
Mar 08 03:28:08.596624 master-0 kubenswrapper[13046]: I0308 03:28:08.596396 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-oauth-serving-cert\") pod \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") "
Mar 08 03:28:08.596624 master-0 kubenswrapper[13046]: I0308 03:28:08.596433 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-oauth-config\") pod \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") "
Mar 08 03:28:08.596624 master-0 kubenswrapper[13046]: I0308 03:28:08.596493 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-config\") pod \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") "
Mar 08 03:28:08.596624 master-0 kubenswrapper[13046]: I0308 03:28:08.596514 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-serving-cert\") pod \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") "
Mar 08 03:28:08.596624 master-0 kubenswrapper[13046]: I0308 03:28:08.596552 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-service-ca\") pod \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") "
Mar 08 03:28:08.597179 master-0 kubenswrapper[13046]: I0308 03:28:08.596707 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-trusted-ca-bundle\") pod \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\" (UID: \"344be988-4e0e-46c9-9ba9-a84e76abe7bc\") "
Mar 08 03:28:08.597337 master-0 kubenswrapper[13046]: I0308 03:28:08.597288 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "344be988-4e0e-46c9-9ba9-a84e76abe7bc" (UID: "344be988-4e0e-46c9-9ba9-a84e76abe7bc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:08.597650 master-0 kubenswrapper[13046]: I0308 03:28:08.597612 13046 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:08.598584 master-0 kubenswrapper[13046]: I0308 03:28:08.598469 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-config" (OuterVolumeSpecName: "console-config") pod "344be988-4e0e-46c9-9ba9-a84e76abe7bc" (UID: "344be988-4e0e-46c9-9ba9-a84e76abe7bc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:08.598691 master-0 kubenswrapper[13046]: I0308 03:28:08.598512 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-service-ca" (OuterVolumeSpecName: "service-ca") pod "344be988-4e0e-46c9-9ba9-a84e76abe7bc" (UID: "344be988-4e0e-46c9-9ba9-a84e76abe7bc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:08.599206 master-0 kubenswrapper[13046]: I0308 03:28:08.599134 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "344be988-4e0e-46c9-9ba9-a84e76abe7bc" (UID: "344be988-4e0e-46c9-9ba9-a84e76abe7bc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:08.601821 master-0 kubenswrapper[13046]: I0308 03:28:08.601306 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "344be988-4e0e-46c9-9ba9-a84e76abe7bc" (UID: "344be988-4e0e-46c9-9ba9-a84e76abe7bc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:08.602686 master-0 kubenswrapper[13046]: I0308 03:28:08.602335 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "344be988-4e0e-46c9-9ba9-a84e76abe7bc" (UID: "344be988-4e0e-46c9-9ba9-a84e76abe7bc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:08.608608 master-0 kubenswrapper[13046]: I0308 03:28:08.603171 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/344be988-4e0e-46c9-9ba9-a84e76abe7bc-kube-api-access-98wdn" (OuterVolumeSpecName: "kube-api-access-98wdn") pod "344be988-4e0e-46c9-9ba9-a84e76abe7bc" (UID: "344be988-4e0e-46c9-9ba9-a84e76abe7bc"). InnerVolumeSpecName "kube-api-access-98wdn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:28:08.699796 master-0 kubenswrapper[13046]: I0308 03:28:08.699707 13046 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:08.699796 master-0 kubenswrapper[13046]: I0308 03:28:08.699771 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98wdn\" (UniqueName: \"kubernetes.io/projected/344be988-4e0e-46c9-9ba9-a84e76abe7bc-kube-api-access-98wdn\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:08.699796 master-0 kubenswrapper[13046]: I0308 03:28:08.699792 13046 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:08.699796 master-0 kubenswrapper[13046]: I0308 03:28:08.699808 13046 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:08.699796 master-0 kubenswrapper[13046]: I0308 03:28:08.699820 13046 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:08.701160 master-0 kubenswrapper[13046]: I0308 03:28:08.699833 13046 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/344be988-4e0e-46c9-9ba9-a84e76abe7bc-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:09.031694 master-0 kubenswrapper[13046]: I0308 03:28:09.031645 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-668dfc897d-db2r2_344be988-4e0e-46c9-9ba9-a84e76abe7bc/console/0.log"
Mar 08 03:28:09.032331 master-0 kubenswrapper[13046]: I0308 03:28:09.031749 13046 generic.go:334] "Generic (PLEG): container finished" podID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerID="b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725" exitCode=2
Mar 08 03:28:09.032331 master-0 kubenswrapper[13046]: I0308 03:28:09.031809 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668dfc897d-db2r2" event={"ID":"344be988-4e0e-46c9-9ba9-a84e76abe7bc","Type":"ContainerDied","Data":"b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725"}
Mar 08 03:28:09.032331 master-0 kubenswrapper[13046]: I0308 03:28:09.031871 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668dfc897d-db2r2" event={"ID":"344be988-4e0e-46c9-9ba9-a84e76abe7bc","Type":"ContainerDied","Data":"245ea35bfc2f9c3196ff568e863b40afee8588a6a70a8ce6270746ae95ff79e0"}
Mar 08 03:28:09.032331 master-0 kubenswrapper[13046]: I0308 03:28:09.031863 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668dfc897d-db2r2"
Mar 08 03:28:09.032331 master-0 kubenswrapper[13046]: I0308 03:28:09.031893 13046 scope.go:117] "RemoveContainer" containerID="b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725"
Mar 08 03:28:09.063839 master-0 kubenswrapper[13046]: I0308 03:28:09.063783 13046 scope.go:117] "RemoveContainer" containerID="b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725"
Mar 08 03:28:09.065815 master-0 kubenswrapper[13046]: E0308 03:28:09.065746 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725\": container with ID starting with b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725 not found: ID does not exist" containerID="b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725"
Mar 08 03:28:09.065934 master-0 kubenswrapper[13046]: I0308 03:28:09.065800 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725"} err="failed to get container status \"b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725\": rpc error: code = NotFound desc = could not find container \"b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725\": container with ID starting with b215577186d164f1157e522f623f5c162b596c259538812109a822ee95367725 not found: ID does not exist"
Mar 08 03:28:09.098455 master-0 kubenswrapper[13046]: I0308 03:28:09.098386 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-668dfc897d-db2r2"]
Mar 08 03:28:09.114242 master-0 kubenswrapper[13046]: I0308 03:28:09.114188 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-668dfc897d-db2r2"]
Mar 08 03:28:10.135396 master-0 kubenswrapper[13046]: I0308 03:28:10.135325 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" path="/var/lib/kubelet/pods/344be988-4e0e-46c9-9ba9-a84e76abe7bc/volumes"
Mar 08 03:28:23.082296 master-0 kubenswrapper[13046]: I0308 03:28:23.082233 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" containerID="cri-o://3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666" gracePeriod=15
Mar 08 03:28:23.554317 master-0 kubenswrapper[13046]: I0308 03:28:23.554276 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7977cd7c97-8tssk_16bab511-b73b-4215-824b-641d45e7987b/console/0.log"
Mar 08 03:28:23.554545 master-0 kubenswrapper[13046]: I0308 03:28:23.554352 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7977cd7c97-8tssk"
Mar 08 03:28:23.662817 master-0 kubenswrapper[13046]: I0308 03:28:23.662519 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-service-ca\") pod \"16bab511-b73b-4215-824b-641d45e7987b\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") "
Mar 08 03:28:23.662817 master-0 kubenswrapper[13046]: I0308 03:28:23.662587 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-console-config\") pod \"16bab511-b73b-4215-824b-641d45e7987b\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") "
Mar 08 03:28:23.662817 master-0 kubenswrapper[13046]: I0308 03:28:23.662621 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-trusted-ca-bundle\") pod \"16bab511-b73b-4215-824b-641d45e7987b\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") "
Mar 08 03:28:23.662817 master-0 kubenswrapper[13046]: I0308 03:28:23.662642 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-oauth-config\") pod \"16bab511-b73b-4215-824b-641d45e7987b\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") "
Mar 08 03:28:23.662817 master-0 kubenswrapper[13046]: I0308 03:28:23.662733 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-599cs\" (UniqueName: \"kubernetes.io/projected/16bab511-b73b-4215-824b-641d45e7987b-kube-api-access-599cs\") pod \"16bab511-b73b-4215-824b-641d45e7987b\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") "
Mar 08 03:28:23.662817 master-0 kubenswrapper[13046]: I0308 03:28:23.662759 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-serving-cert\") pod \"16bab511-b73b-4215-824b-641d45e7987b\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") "
Mar 08 03:28:23.662817 master-0 kubenswrapper[13046]: I0308 03:28:23.662777 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-oauth-serving-cert\") pod \"16bab511-b73b-4215-824b-641d45e7987b\" (UID: \"16bab511-b73b-4215-824b-641d45e7987b\") "
Mar 08 03:28:23.666234 master-0 kubenswrapper[13046]: I0308 03:28:23.666185 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-service-ca" (OuterVolumeSpecName: "service-ca") pod "16bab511-b73b-4215-824b-641d45e7987b" (UID: "16bab511-b73b-4215-824b-641d45e7987b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:23.666307 master-0 kubenswrapper[13046]: I0308 03:28:23.666245 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-console-config" (OuterVolumeSpecName: "console-config") pod "16bab511-b73b-4215-824b-641d45e7987b" (UID: "16bab511-b73b-4215-824b-641d45e7987b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:23.666344 master-0 kubenswrapper[13046]: I0308 03:28:23.666259 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "16bab511-b73b-4215-824b-641d45e7987b" (UID: "16bab511-b73b-4215-824b-641d45e7987b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:23.667063 master-0 kubenswrapper[13046]: I0308 03:28:23.667033 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "16bab511-b73b-4215-824b-641d45e7987b" (UID: "16bab511-b73b-4215-824b-641d45e7987b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:23.668020 master-0 kubenswrapper[13046]: I0308 03:28:23.667980 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "16bab511-b73b-4215-824b-641d45e7987b" (UID: "16bab511-b73b-4215-824b-641d45e7987b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:23.676944 master-0 kubenswrapper[13046]: I0308 03:28:23.676320 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16bab511-b73b-4215-824b-641d45e7987b-kube-api-access-599cs" (OuterVolumeSpecName: "kube-api-access-599cs") pod "16bab511-b73b-4215-824b-641d45e7987b" (UID: "16bab511-b73b-4215-824b-641d45e7987b"). InnerVolumeSpecName "kube-api-access-599cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:28:23.687635 master-0 kubenswrapper[13046]: I0308 03:28:23.687588 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "16bab511-b73b-4215-824b-641d45e7987b" (UID: "16bab511-b73b-4215-824b-641d45e7987b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:23.767898 master-0 kubenswrapper[13046]: I0308 03:28:23.767771 13046 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:23.767898 master-0 kubenswrapper[13046]: I0308 03:28:23.767816 13046 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-console-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:23.767898 master-0 kubenswrapper[13046]: I0308 03:28:23.767827 13046 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:23.767898 master-0 kubenswrapper[13046]: I0308 03:28:23.767838 13046 reconciler_common.go:293] "Volume detached for volume
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:28:23.767898 master-0 kubenswrapper[13046]: I0308 03:28:23.767847 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-599cs\" (UniqueName: \"kubernetes.io/projected/16bab511-b73b-4215-824b-641d45e7987b-kube-api-access-599cs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:28:23.767898 master-0 kubenswrapper[13046]: I0308 03:28:23.767856 13046 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16bab511-b73b-4215-824b-641d45e7987b-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:28:23.767898 master-0 kubenswrapper[13046]: I0308 03:28:23.767864 13046 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16bab511-b73b-4215-824b-641d45e7987b-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:28:24.195154 master-0 kubenswrapper[13046]: I0308 03:28:24.195106 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7977cd7c97-8tssk_16bab511-b73b-4215-824b-641d45e7987b/console/0.log" Mar 08 03:28:24.195722 master-0 kubenswrapper[13046]: I0308 03:28:24.195160 13046 generic.go:334] "Generic (PLEG): container finished" podID="16bab511-b73b-4215-824b-641d45e7987b" containerID="3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666" exitCode=2 Mar 08 03:28:24.195722 master-0 kubenswrapper[13046]: I0308 03:28:24.195191 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7977cd7c97-8tssk" event={"ID":"16bab511-b73b-4215-824b-641d45e7987b","Type":"ContainerDied","Data":"3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666"} Mar 08 03:28:24.195722 master-0 kubenswrapper[13046]: I0308 03:28:24.195221 13046 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-7977cd7c97-8tssk" event={"ID":"16bab511-b73b-4215-824b-641d45e7987b","Type":"ContainerDied","Data":"67bf768e562b8de1c76885a3794a14013dfc5d68445bd6ce75388a56d49f7f17"} Mar 08 03:28:24.195722 master-0 kubenswrapper[13046]: I0308 03:28:24.195242 13046 scope.go:117] "RemoveContainer" containerID="3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666" Mar 08 03:28:24.195722 master-0 kubenswrapper[13046]: I0308 03:28:24.195375 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7977cd7c97-8tssk" Mar 08 03:28:24.212920 master-0 kubenswrapper[13046]: I0308 03:28:24.212875 13046 scope.go:117] "RemoveContainer" containerID="3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666" Mar 08 03:28:24.213702 master-0 kubenswrapper[13046]: E0308 03:28:24.213406 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666\": container with ID starting with 3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666 not found: ID does not exist" containerID="3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666" Mar 08 03:28:24.213762 master-0 kubenswrapper[13046]: I0308 03:28:24.213716 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666"} err="failed to get container status \"3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666\": rpc error: code = NotFound desc = could not find container \"3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666\": container with ID starting with 3c3defedb6ddf32c2e2858af90bf8e47e03945348022419acfb4077410451666 not found: ID does not exist" Mar 08 03:28:24.227460 master-0 kubenswrapper[13046]: I0308 03:28:24.227387 13046 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-7977cd7c97-8tssk"] Mar 08 03:28:24.236456 master-0 kubenswrapper[13046]: I0308 03:28:24.236402 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7977cd7c97-8tssk"] Mar 08 03:28:24.420803 master-0 kubenswrapper[13046]: I0308 03:28:24.420656 13046 patch_prober.go:28] interesting pod/console-7977cd7c97-8tssk container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.128.0.103:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 03:28:24.420803 master-0 kubenswrapper[13046]: I0308 03:28:24.420745 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-7977cd7c97-8tssk" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 03:28:24.508390 master-0 kubenswrapper[13046]: I0308 03:28:24.508334 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 03:28:24.509837 master-0 kubenswrapper[13046]: I0308 03:28:24.508652 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="prometheus" containerID="cri-o://b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093" gracePeriod=600 Mar 08 03:28:24.509837 master-0 kubenswrapper[13046]: I0308 03:28:24.508723 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy" containerID="cri-o://b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b" 
gracePeriod=600 Mar 08 03:28:24.509837 master-0 kubenswrapper[13046]: I0308 03:28:24.508777 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="config-reloader" containerID="cri-o://fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324" gracePeriod=600 Mar 08 03:28:24.509837 master-0 kubenswrapper[13046]: I0308 03:28:24.508768 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-web" containerID="cri-o://3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f" gracePeriod=600 Mar 08 03:28:24.509837 master-0 kubenswrapper[13046]: I0308 03:28:24.508845 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-thanos" containerID="cri-o://09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509" gracePeriod=600 Mar 08 03:28:24.509837 master-0 kubenswrapper[13046]: I0308 03:28:24.508745 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="thanos-sidecar" containerID="cri-o://f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae" gracePeriod=600 Mar 08 03:28:24.993827 master-0 kubenswrapper[13046]: I0308 03:28:24.993306 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090284 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-metrics-client-certs\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090343 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-kube-rbac-proxy\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090396 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-thanos-prometheus-http-client-file\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090418 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-tls\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090450 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-grpc-tls\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") " 
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090468 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090510 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-web-config\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090529 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090563 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-trusted-ca-bundle\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090585 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-tls-assets\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090607 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-kubelet-serving-ca-bundle\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090621 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-metrics-client-ca\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090637 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-rulefiles-0\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090692 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxxj4\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-kube-api-access-wxxj4\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090724 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config-out\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090745 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-db\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090763 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.090797 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-serving-certs-ca-bundle\") pod \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\" (UID: \"b36ba129-ccfe-4dfc-889a-aa98d4dece4a\") "
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.091633 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.092026 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.094417 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:28:25.097504 master-0 kubenswrapper[13046]: I0308 03:28:25.096209 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:25.106516 master-0 kubenswrapper[13046]: I0308 03:28:25.100214 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:25.106516 master-0 kubenswrapper[13046]: I0308 03:28:25.104673 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.106516 master-0 kubenswrapper[13046]: I0308 03:28:25.105032 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:28:25.125911 master-0 kubenswrapper[13046]: I0308 03:28:25.122741 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.125911 master-0 kubenswrapper[13046]: I0308 03:28:25.123870 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config" (OuterVolumeSpecName: "config") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.127951 master-0 kubenswrapper[13046]: I0308 03:28:25.127655 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-kube-api-access-wxxj4" (OuterVolumeSpecName: "kube-api-access-wxxj4") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "kube-api-access-wxxj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:28:25.130792 master-0 kubenswrapper[13046]: I0308 03:28:25.128674 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config-out" (OuterVolumeSpecName: "config-out") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:28:25.130792 master-0 kubenswrapper[13046]: I0308 03:28:25.128791 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:28:25.130792 master-0 kubenswrapper[13046]: I0308 03:28:25.130738 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.133612 master-0 kubenswrapper[13046]: I0308 03:28:25.131695 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.133612 master-0 kubenswrapper[13046]: I0308 03:28:25.132384 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.154525 master-0 kubenswrapper[13046]: I0308 03:28:25.136143 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.154525 master-0 kubenswrapper[13046]: I0308 03:28:25.153635 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.193392 master-0 kubenswrapper[13046]: I0308 03:28:25.193309 13046 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193392 master-0 kubenswrapper[13046]: I0308 03:28:25.193356 13046 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193392 master-0 kubenswrapper[13046]: I0308 03:28:25.193366 13046 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-grpc-tls\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193392 master-0 kubenswrapper[13046]: I0308 03:28:25.193380 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193392 master-0 kubenswrapper[13046]: I0308 03:28:25.193393 13046 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193392 master-0 kubenswrapper[13046]: I0308 03:28:25.193405 13046 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193417 13046 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-tls-assets\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193432 13046 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193448 13046 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193459 13046 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193470 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxxj4\" (UniqueName: \"kubernetes.io/projected/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-kube-api-access-wxxj4\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193478 13046 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-config-out\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193502 13046 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193511 13046 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193520 13046 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193530 13046 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.193876 master-0 kubenswrapper[13046]: I0308 03:28:25.193539 13046 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.213234 master-0 kubenswrapper[13046]: I0308 03:28:25.213177 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-web-config" (OuterVolumeSpecName: "web-config") pod "b36ba129-ccfe-4dfc-889a-aa98d4dece4a" (UID: "b36ba129-ccfe-4dfc-889a-aa98d4dece4a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236873 13046 generic.go:334] "Generic (PLEG): container finished" podID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509" exitCode=0
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236909 13046 generic.go:334] "Generic (PLEG): container finished" podID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b" exitCode=0
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236917 13046 generic.go:334] "Generic (PLEG): container finished" podID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f" exitCode=0
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236925 13046 generic.go:334] "Generic (PLEG): container finished" podID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae" exitCode=0
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236931 13046 generic.go:334] "Generic (PLEG): container finished" podID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324" exitCode=0
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236938 13046 generic.go:334] "Generic (PLEG): container finished" podID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093" exitCode=0
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236960 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"}
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236987 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"}
Mar 08 03:28:25.236990 master-0 kubenswrapper[13046]: I0308 03:28:25.236997 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"}
Mar 08 03:28:25.237398 master-0 kubenswrapper[13046]: I0308 03:28:25.237007 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"}
Mar 08 03:28:25.237398 master-0 kubenswrapper[13046]: I0308 03:28:25.237019 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"}
Mar 08 03:28:25.237398 master-0 kubenswrapper[13046]: I0308 03:28:25.237028 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"}
Mar 08 03:28:25.237398 master-0 kubenswrapper[13046]: I0308 03:28:25.237037 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b36ba129-ccfe-4dfc-889a-aa98d4dece4a","Type":"ContainerDied","Data":"b972771ed331a46f9d80ea803f2110481c6c95662feba692274813259f88d537"}
Mar 08 03:28:25.237398 master-0 kubenswrapper[13046]: I0308 03:28:25.237070 13046 scope.go:117] "RemoveContainer" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"
Mar 08 03:28:25.237398 master-0 kubenswrapper[13046]: I0308 03:28:25.237179 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.255236 master-0 kubenswrapper[13046]: I0308 03:28:25.254903 13046 scope.go:117] "RemoveContainer" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"
Mar 08 03:28:25.274998 master-0 kubenswrapper[13046]: I0308 03:28:25.274880 13046 scope.go:117] "RemoveContainer" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"
Mar 08 03:28:25.286859 master-0 kubenswrapper[13046]: I0308 03:28:25.286819 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 08 03:28:25.296581 master-0 kubenswrapper[13046]: I0308 03:28:25.294365 13046 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b36ba129-ccfe-4dfc-889a-aa98d4dece4a-web-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:28:25.296581 master-0 kubenswrapper[13046]: I0308 03:28:25.294405 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 08 03:28:25.306007 master-0 kubenswrapper[13046]: I0308 03:28:25.303630 13046 scope.go:117] "RemoveContainer" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"
Mar 08 03:28:25.324171 master-0 kubenswrapper[13046]: I0308 03:28:25.324034 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324711 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-thanos"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324731 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-thanos"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324740 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="prometheus"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324748 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="prometheus"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324759 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324765 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324781 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="thanos-sidecar"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324787 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="thanos-sidecar"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324808 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="init-config-reloader"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324814 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="init-config-reloader"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324823 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="config-reloader"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324830 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="config-reloader"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324842 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324848 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324860 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-web"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324867 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-web"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: E0308 03:28:25.324878 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324883 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.324985 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="344be988-4e0e-46c9-9ba9-a84e76abe7bc" containerName="console"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.325005 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="prometheus"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.325017 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="thanos-sidecar"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.325027 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="16bab511-b73b-4215-824b-641d45e7987b" containerName="console"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.325036 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="config-reloader"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.325052 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-web"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.325064 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy-thanos"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.325078 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" containerName="kube-rbac-proxy"
Mar 08 03:28:25.327549 master-0 kubenswrapper[13046]: I0308 03:28:25.326862 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.331652 master-0 kubenswrapper[13046]: I0308 03:28:25.331613 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-xfc6f"
Mar 08 03:28:25.331819 master-0 kubenswrapper[13046]: I0308 03:28:25.331791 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 08 03:28:25.331860 master-0 kubenswrapper[13046]: I0308 03:28:25.331817 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 08 03:28:25.331891 master-0 kubenswrapper[13046]: I0308 03:28:25.331864 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 08 03:28:25.331935 master-0 kubenswrapper[13046]: I0308 03:28:25.331829 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 08 03:28:25.331974 master-0 kubenswrapper[13046]: I0308 03:28:25.331954 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 08 03:28:25.331974 master-0 kubenswrapper[13046]: I0308 03:28:25.331625 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 08 03:28:25.332040 master-0 kubenswrapper[13046]: I0308 03:28:25.332025 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 08 03:28:25.332079 master-0 kubenswrapper[13046]: I0308 03:28:25.332054 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 08 03:28:25.332619 master-0 kubenswrapper[13046]: I0308 03:28:25.332595 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 08 03:28:25.332748 master-0 kubenswrapper[13046]: I0308 03:28:25.332727 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-1allgkpdij0ou"
Mar 08 03:28:25.335121 master-0 kubenswrapper[13046]: I0308 03:28:25.335052 13046 scope.go:117] "RemoveContainer" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"
Mar 08 03:28:25.342502 master-0 kubenswrapper[13046]: I0308 03:28:25.342155 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 08 03:28:25.343785 master-0 kubenswrapper[13046]: I0308 03:28:25.343755 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 08 03:28:25.357551 master-0 kubenswrapper[13046]: I0308 03:28:25.353958 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 08 03:28:25.390656 master-0 kubenswrapper[13046]: I0308 03:28:25.390545 13046 scope.go:117] "RemoveContainer" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"
Mar 08 03:28:25.395470 master-0 kubenswrapper[13046]: I0308 03:28:25.395434 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77985828-fe36-4c71-afe2-3b0a69f6220b-config-out\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395643 master-0 kubenswrapper[13046]: I0308 03:28:25.395616 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77985828-fe36-4c71-afe2-3b0a69f6220b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395700 master-0 kubenswrapper[13046]: I0308 03:28:25.395651 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395700 master-0 kubenswrapper[13046]: I0308 03:28:25.395690 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395756 master-0 kubenswrapper[13046]: I0308 03:28:25.395727 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395756 master-0 kubenswrapper[13046]: I0308 03:28:25.395747 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395820 master-0 kubenswrapper[13046]: I0308 03:28:25.395766 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395820 master-0 kubenswrapper[13046]: I0308 03:28:25.395784 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395820 master-0 kubenswrapper[13046]: I0308 03:28:25.395805 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395910 master-0 kubenswrapper[13046]: I0308 03:28:25.395822 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395910 master-0 kubenswrapper[13046]: I0308 03:28:25.395842 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-web-config\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395910 master-0 kubenswrapper[13046]: I0308 03:28:25.395858 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-config\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395910 master-0 kubenswrapper[13046]: I0308 03:28:25.395887 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.395910 master-0 kubenswrapper[13046]: I0308 03:28:25.395901 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzkj\" (UniqueName: \"kubernetes.io/projected/77985828-fe36-4c71-afe2-3b0a69f6220b-kube-api-access-jwzkj\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.396108 master-0 kubenswrapper[13046]: I0308 03:28:25.395926 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.396108 master-0 kubenswrapper[13046]: I0308 03:28:25.395946 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.396108 master-0 kubenswrapper[13046]: I0308 03:28:25.395959 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.396108 master-0 kubenswrapper[13046]: I0308 03:28:25.395980 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 08 03:28:25.408719 master-0 kubenswrapper[13046]: I0308 03:28:25.408681 13046 scope.go:117] "RemoveContainer" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"
Mar 08 03:28:25.424591 master-0 kubenswrapper[13046]: I0308 03:28:25.424092 13046 scope.go:117] "RemoveContainer" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"
Mar 08 03:28:25.424591 master-0 kubenswrapper[13046]: E0308 03:28:25.424409 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": container with ID starting with 09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509 not found: ID does not exist" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"
Mar 08 03:28:25.424591 master-0 kubenswrapper[13046]: I0308 03:28:25.424439 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"} err="failed to get container status \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": rpc error: code = NotFound desc = could not find container \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": container with ID starting with 09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509 not found: ID does not exist"
Mar 08 03:28:25.424591 master-0 kubenswrapper[13046]: I0308 03:28:25.424459 13046 scope.go:117] "RemoveContainer" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"
Mar 08 03:28:25.425226 master-0 kubenswrapper[13046]: E0308 03:28:25.424882 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": container with ID starting with b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b not found: ID does not exist" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"
Mar 08 03:28:25.425226 master-0 kubenswrapper[13046]: I0308 03:28:25.424906 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"} err="failed to get container status \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": rpc error: code = NotFound desc = could not find container \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": container with ID starting with b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b not found: ID does not exist"
Mar 08 03:28:25.425226 master-0 kubenswrapper[13046]: I0308 03:28:25.424919 13046 scope.go:117] "RemoveContainer" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"
Mar 08 03:28:25.425226 master-0 kubenswrapper[13046]: E0308 03:28:25.425077 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": container with ID starting with 3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f not found: ID does not exist" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"
Mar 08 03:28:25.425226 master-0 kubenswrapper[13046]: I0308 03:28:25.425092 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"} err="failed to get container status \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": rpc error: code = NotFound desc = could not find container \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": container with ID starting with 3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f not found: ID does not exist"
Mar 08 03:28:25.425226 master-0 kubenswrapper[13046]: I0308 03:28:25.425107 13046 scope.go:117] "RemoveContainer" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"
Mar 08 03:28:25.425442 master-0 kubenswrapper[13046]: E0308 03:28:25.425267 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": container with ID starting with f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae not found: ID does not exist" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"
Mar 08 03:28:25.425442 master-0 kubenswrapper[13046]: I0308 03:28:25.425284 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"} err="failed to get container status \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": rpc error: code = NotFound desc = could not find container \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": container with ID starting with f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae not found: ID does not exist"
Mar 08 03:28:25.425442 master-0 kubenswrapper[13046]: I0308 03:28:25.425296 13046 scope.go:117] "RemoveContainer" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"
Mar 08 03:28:25.425550 master-0 kubenswrapper[13046]: E0308 03:28:25.425454 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": container with ID starting with fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324 not found: ID does not exist" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"
Mar 08 03:28:25.425550 master-0 kubenswrapper[13046]: I0308 03:28:25.425469 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"} err="failed to get container status \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": rpc error: code = NotFound desc = could not find container \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": container with ID starting with fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324 not found: ID does not exist"
Mar 08 03:28:25.425615 master-0 kubenswrapper[13046]: I0308 03:28:25.425592 13046 scope.go:117] "RemoveContainer" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"
Mar 08 03:28:25.425860 master-0 kubenswrapper[13046]: E0308 03:28:25.425833 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": container with ID starting with b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093 not found: ID does not exist" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"
Mar 08 03:28:25.425905 master-0 kubenswrapper[13046]: I0308 03:28:25.425857 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"} err="failed to get container status \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": rpc error: code = NotFound desc = could not find container \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": container with ID starting with b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093 not found: ID does not exist"
Mar 08 03:28:25.425905 master-0 kubenswrapper[13046]: I0308 03:28:25.425887 13046 scope.go:117] "RemoveContainer" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"
Mar 08 03:28:25.426333 master-0 kubenswrapper[13046]: E0308 03:28:25.426296 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": container with ID starting with bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09 not found: ID does not exist" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"
Mar 08 03:28:25.426388 master-0 kubenswrapper[13046]: I0308 03:28:25.426345 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"} err="failed to get container status \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": rpc error: code = NotFound desc = could not find container \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": container with ID starting with bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09 not found: ID does not exist"
Mar 08 03:28:25.426388 master-0 kubenswrapper[13046]: I0308 03:28:25.426373 13046 scope.go:117] "RemoveContainer" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"
Mar 08 03:28:25.427356 master-0 kubenswrapper[13046]: I0308 03:28:25.427321 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"} err="failed to get container status \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": rpc error: code = NotFound desc = could not find container \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": container with ID starting with 09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509 not found: ID does not exist"
Mar 08 03:28:25.427450 master-0 kubenswrapper[13046]: I0308 03:28:25.427436 13046 scope.go:117] "RemoveContainer" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"
Mar 08 03:28:25.427869 master-0 kubenswrapper[13046]: I0308 03:28:25.427852 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"} err="failed to get container status \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": rpc error: code = NotFound desc = could not find container \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": container with ID starting with b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b not found: ID does not exist"
Mar 08 03:28:25.427947 master-0 kubenswrapper[13046]: I0308 03:28:25.427936 13046 scope.go:117] "RemoveContainer" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"
Mar 08 03:28:25.428177 master-0 kubenswrapper[13046]: I0308 03:28:25.428156 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"} err="failed to get container status \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": rpc error: code = NotFound desc = could not find container \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": container with ID starting with 3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f not found: ID does not exist"
Mar 08 03:28:25.428249 master-0 kubenswrapper[13046]: I0308 03:28:25.428238 13046 scope.go:117] "RemoveContainer" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"
Mar 08 03:28:25.431051 master-0 kubenswrapper[13046]: I0308 03:28:25.430969 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"} err="failed to get container status \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": rpc error: code = NotFound desc = could not find container \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": container with ID starting with f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae not found: ID does not exist"
Mar 08 03:28:25.431051 master-0 kubenswrapper[13046]: I0308 03:28:25.430996 13046 scope.go:117] "RemoveContainer" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"
Mar 08 03:28:25.431303 master-0 kubenswrapper[13046]: I0308 03:28:25.431277 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"} err="failed to get container status \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": rpc error: code = NotFound desc = could not find container \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": container with ID starting with fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324 not found: ID does not exist"
Mar 08 03:28:25.431360 master-0 kubenswrapper[13046]: I0308 03:28:25.431301 13046 scope.go:117] "RemoveContainer" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"
Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.431508 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"} err="failed to get container status \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": rpc error: code = NotFound desc = could not find container \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": container with ID starting with b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093 not found: ID does not exist"
Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.431531 13046 scope.go:117] "RemoveContainer" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"
Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.431743 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"} err="failed to get container status \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": rpc error: code = NotFound desc = could not find container \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": container with ID starting with bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09 not found: ID does not exist"
Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.431758 13046 scope.go:117] "RemoveContainer" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"
Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.431955 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"} err="failed to get container status \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": rpc error: code = NotFound desc = could not find container \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": container with ID starting with 09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509 not found: ID does not exist" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.431969 13046 scope.go:117] "RemoveContainer" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432122 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"} err="failed to get container status \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": rpc error: code = NotFound desc = could not find container \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": container with ID starting with b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b not found: ID does not exist" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432136 13046 scope.go:117] "RemoveContainer" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432277 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"} err="failed to get container status \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": rpc error: code = NotFound desc = could not find 
container \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": container with ID starting with 3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f not found: ID does not exist" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432291 13046 scope.go:117] "RemoveContainer" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432431 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"} err="failed to get container status \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": rpc error: code = NotFound desc = could not find container \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": container with ID starting with f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae not found: ID does not exist" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432447 13046 scope.go:117] "RemoveContainer" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432608 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"} err="failed to get container status \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": rpc error: code = NotFound desc = could not find container \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": container with ID starting with fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324 not found: ID does not exist" Mar 08 03:28:25.432624 master-0 kubenswrapper[13046]: I0308 03:28:25.432621 13046 scope.go:117] "RemoveContainer" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093" 
Mar 08 03:28:25.433216 master-0 kubenswrapper[13046]: I0308 03:28:25.432766 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"} err="failed to get container status \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": rpc error: code = NotFound desc = could not find container \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": container with ID starting with b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093 not found: ID does not exist" Mar 08 03:28:25.433216 master-0 kubenswrapper[13046]: I0308 03:28:25.432779 13046 scope.go:117] "RemoveContainer" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09" Mar 08 03:28:25.433590 master-0 kubenswrapper[13046]: I0308 03:28:25.433432 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"} err="failed to get container status \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": rpc error: code = NotFound desc = could not find container \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": container with ID starting with bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09 not found: ID does not exist" Mar 08 03:28:25.433590 master-0 kubenswrapper[13046]: I0308 03:28:25.433457 13046 scope.go:117] "RemoveContainer" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509" Mar 08 03:28:25.433755 master-0 kubenswrapper[13046]: I0308 03:28:25.433674 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"} err="failed to get container status \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": rpc error: code = NotFound desc = could not find 
container \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": container with ID starting with 09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509 not found: ID does not exist" Mar 08 03:28:25.433755 master-0 kubenswrapper[13046]: I0308 03:28:25.433693 13046 scope.go:117] "RemoveContainer" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b" Mar 08 03:28:25.433978 master-0 kubenswrapper[13046]: I0308 03:28:25.433863 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"} err="failed to get container status \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": rpc error: code = NotFound desc = could not find container \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": container with ID starting with b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b not found: ID does not exist" Mar 08 03:28:25.433978 master-0 kubenswrapper[13046]: I0308 03:28:25.433884 13046 scope.go:117] "RemoveContainer" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f" Mar 08 03:28:25.434079 master-0 kubenswrapper[13046]: I0308 03:28:25.434030 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"} err="failed to get container status \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": rpc error: code = NotFound desc = could not find container \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": container with ID starting with 3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f not found: ID does not exist" Mar 08 03:28:25.434079 master-0 kubenswrapper[13046]: I0308 03:28:25.434043 13046 scope.go:117] "RemoveContainer" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae" 
Mar 08 03:28:25.434213 master-0 kubenswrapper[13046]: I0308 03:28:25.434183 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"} err="failed to get container status \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": rpc error: code = NotFound desc = could not find container \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": container with ID starting with f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae not found: ID does not exist" Mar 08 03:28:25.434213 master-0 kubenswrapper[13046]: I0308 03:28:25.434204 13046 scope.go:117] "RemoveContainer" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324" Mar 08 03:28:25.434378 master-0 kubenswrapper[13046]: I0308 03:28:25.434344 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"} err="failed to get container status \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": rpc error: code = NotFound desc = could not find container \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": container with ID starting with fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324 not found: ID does not exist" Mar 08 03:28:25.434378 master-0 kubenswrapper[13046]: I0308 03:28:25.434362 13046 scope.go:117] "RemoveContainer" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093" Mar 08 03:28:25.434790 master-0 kubenswrapper[13046]: I0308 03:28:25.434742 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"} err="failed to get container status \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": rpc error: code = NotFound desc = could not find 
container \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": container with ID starting with b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093 not found: ID does not exist" Mar 08 03:28:25.434790 master-0 kubenswrapper[13046]: I0308 03:28:25.434763 13046 scope.go:117] "RemoveContainer" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09" Mar 08 03:28:25.434990 master-0 kubenswrapper[13046]: I0308 03:28:25.434967 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"} err="failed to get container status \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": rpc error: code = NotFound desc = could not find container \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": container with ID starting with bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09 not found: ID does not exist" Mar 08 03:28:25.434990 master-0 kubenswrapper[13046]: I0308 03:28:25.434985 13046 scope.go:117] "RemoveContainer" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509" Mar 08 03:28:25.435334 master-0 kubenswrapper[13046]: I0308 03:28:25.435309 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"} err="failed to get container status \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": rpc error: code = NotFound desc = could not find container \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": container with ID starting with 09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509 not found: ID does not exist" Mar 08 03:28:25.435334 master-0 kubenswrapper[13046]: I0308 03:28:25.435325 13046 scope.go:117] "RemoveContainer" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b" 
Mar 08 03:28:25.437466 master-0 kubenswrapper[13046]: I0308 03:28:25.436793 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"} err="failed to get container status \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": rpc error: code = NotFound desc = could not find container \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": container with ID starting with b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b not found: ID does not exist" Mar 08 03:28:25.437466 master-0 kubenswrapper[13046]: I0308 03:28:25.436813 13046 scope.go:117] "RemoveContainer" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f" Mar 08 03:28:25.437466 master-0 kubenswrapper[13046]: I0308 03:28:25.437005 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"} err="failed to get container status \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": rpc error: code = NotFound desc = could not find container \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": container with ID starting with 3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f not found: ID does not exist" Mar 08 03:28:25.437466 master-0 kubenswrapper[13046]: I0308 03:28:25.437019 13046 scope.go:117] "RemoveContainer" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae" Mar 08 03:28:25.437773 master-0 kubenswrapper[13046]: I0308 03:28:25.437718 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"} err="failed to get container status \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": rpc error: code = NotFound desc = could not find 
container \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": container with ID starting with f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae not found: ID does not exist" Mar 08 03:28:25.437820 master-0 kubenswrapper[13046]: I0308 03:28:25.437770 13046 scope.go:117] "RemoveContainer" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324" Mar 08 03:28:25.438240 master-0 kubenswrapper[13046]: I0308 03:28:25.438103 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"} err="failed to get container status \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": rpc error: code = NotFound desc = could not find container \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": container with ID starting with fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324 not found: ID does not exist" Mar 08 03:28:25.438240 master-0 kubenswrapper[13046]: I0308 03:28:25.438145 13046 scope.go:117] "RemoveContainer" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093" Mar 08 03:28:25.438393 master-0 kubenswrapper[13046]: I0308 03:28:25.438362 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"} err="failed to get container status \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": rpc error: code = NotFound desc = could not find container \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": container with ID starting with b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093 not found: ID does not exist" Mar 08 03:28:25.438393 master-0 kubenswrapper[13046]: I0308 03:28:25.438385 13046 scope.go:117] "RemoveContainer" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09" 
Mar 08 03:28:25.438593 master-0 kubenswrapper[13046]: I0308 03:28:25.438566 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"} err="failed to get container status \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": rpc error: code = NotFound desc = could not find container \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": container with ID starting with bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09 not found: ID does not exist" Mar 08 03:28:25.438593 master-0 kubenswrapper[13046]: I0308 03:28:25.438587 13046 scope.go:117] "RemoveContainer" containerID="09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509" Mar 08 03:28:25.438784 master-0 kubenswrapper[13046]: I0308 03:28:25.438761 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509"} err="failed to get container status \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": rpc error: code = NotFound desc = could not find container \"09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509\": container with ID starting with 09e9ac517f14e8b66d6155208bfe108221241cf2b6df0696fd6fc61fd9a4f509 not found: ID does not exist" Mar 08 03:28:25.438784 master-0 kubenswrapper[13046]: I0308 03:28:25.438782 13046 scope.go:117] "RemoveContainer" containerID="b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b" Mar 08 03:28:25.438995 master-0 kubenswrapper[13046]: I0308 03:28:25.438975 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b"} err="failed to get container status \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": rpc error: code = NotFound desc = could not find 
container \"b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b\": container with ID starting with b5f22a28e0844be19e9520df1b6f3893f0006d48638763365242ab1b0072b21b not found: ID does not exist" Mar 08 03:28:25.439032 master-0 kubenswrapper[13046]: I0308 03:28:25.438994 13046 scope.go:117] "RemoveContainer" containerID="3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f" Mar 08 03:28:25.439164 master-0 kubenswrapper[13046]: I0308 03:28:25.439145 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f"} err="failed to get container status \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": rpc error: code = NotFound desc = could not find container \"3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f\": container with ID starting with 3c169fbbac29c36c49b34e68c4aa3d664e10b4b609be6e09523fb6ad2c126a1f not found: ID does not exist" Mar 08 03:28:25.439199 master-0 kubenswrapper[13046]: I0308 03:28:25.439164 13046 scope.go:117] "RemoveContainer" containerID="f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae" Mar 08 03:28:25.439344 master-0 kubenswrapper[13046]: I0308 03:28:25.439322 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae"} err="failed to get container status \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": rpc error: code = NotFound desc = could not find container \"f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae\": container with ID starting with f491ab235133d7edea549b96d0c4c88d85e032338e5dc546831b20e034b92bae not found: ID does not exist" Mar 08 03:28:25.439387 master-0 kubenswrapper[13046]: I0308 03:28:25.439343 13046 scope.go:117] "RemoveContainer" containerID="fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324" 
Mar 08 03:28:25.439572 master-0 kubenswrapper[13046]: I0308 03:28:25.439551 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324"} err="failed to get container status \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": rpc error: code = NotFound desc = could not find container \"fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324\": container with ID starting with fcb0ccb06060f914b6bb1f412c90e18ac5690ea617af083888b13ddc76576324 not found: ID does not exist" Mar 08 03:28:25.439572 master-0 kubenswrapper[13046]: I0308 03:28:25.439570 13046 scope.go:117] "RemoveContainer" containerID="b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093" Mar 08 03:28:25.439800 master-0 kubenswrapper[13046]: I0308 03:28:25.439779 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093"} err="failed to get container status \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": rpc error: code = NotFound desc = could not find container \"b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093\": container with ID starting with b9f1b8347b4e67b7afa8f4d533f8c367bb08f109311781b76de18d8892892093 not found: ID does not exist" Mar 08 03:28:25.439800 master-0 kubenswrapper[13046]: I0308 03:28:25.439798 13046 scope.go:117] "RemoveContainer" containerID="bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09" Mar 08 03:28:25.444327 master-0 kubenswrapper[13046]: I0308 03:28:25.444184 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09"} err="failed to get container status \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": rpc error: code = NotFound desc = could not find 
container \"bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09\": container with ID starting with bdb1f2b1f9de042d56c06af1bbed00cf2baa7c612be5d5376c161bff3534ec09 not found: ID does not exist" Mar 08 03:28:25.497364 master-0 kubenswrapper[13046]: I0308 03:28:25.497238 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.497561 master-0 kubenswrapper[13046]: I0308 03:28:25.497439 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.497561 master-0 kubenswrapper[13046]: I0308 03:28:25.497544 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.497890 master-0 kubenswrapper[13046]: I0308 03:28:25.497841 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77985828-fe36-4c71-afe2-3b0a69f6220b-config-out\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.497975 master-0 kubenswrapper[13046]: I0308 03:28:25.497956 13046 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77985828-fe36-4c71-afe2-3b0a69f6220b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.498024 master-0 kubenswrapper[13046]: I0308 03:28:25.498004 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.498082 master-0 kubenswrapper[13046]: I0308 03:28:25.498065 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.498136 master-0 kubenswrapper[13046]: I0308 03:28:25.498118 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.498170 master-0 kubenswrapper[13046]: I0308 03:28:25.498159 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.498203 master-0 kubenswrapper[13046]: I0308 03:28:25.498186 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.498240 master-0 kubenswrapper[13046]: I0308 03:28:25.498213 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.498280 master-0 kubenswrapper[13046]: I0308 03:28:25.498259 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 kubenswrapper[13046]: I0308 03:28:25.498284 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 kubenswrapper[13046]: I0308 03:28:25.498354 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-web-config\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 
kubenswrapper[13046]: I0308 03:28:25.498382 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-config\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 kubenswrapper[13046]: I0308 03:28:25.498454 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 kubenswrapper[13046]: I0308 03:28:25.498507 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzkj\" (UniqueName: \"kubernetes.io/projected/77985828-fe36-4c71-afe2-3b0a69f6220b-kube-api-access-jwzkj\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 kubenswrapper[13046]: I0308 03:28:25.498572 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 kubenswrapper[13046]: I0308 03:28:25.498665 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 
master-0 kubenswrapper[13046]: I0308 03:28:25.499240 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.499571 master-0 kubenswrapper[13046]: I0308 03:28:25.499566 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.500365 master-0 kubenswrapper[13046]: I0308 03:28:25.500332 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.500938 master-0 kubenswrapper[13046]: I0308 03:28:25.500914 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.505834 master-0 kubenswrapper[13046]: I0308 03:28:25.505289 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77985828-fe36-4c71-afe2-3b0a69f6220b-config-out\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.505834 master-0 kubenswrapper[13046]: I0308 03:28:25.505784 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.505834 master-0 kubenswrapper[13046]: I0308 03:28:25.505832 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.505994 master-0 kubenswrapper[13046]: I0308 03:28:25.505837 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77985828-fe36-4c71-afe2-3b0a69f6220b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.506054 master-0 kubenswrapper[13046]: I0308 03:28:25.506004 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-config\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.509853 master-0 kubenswrapper[13046]: I0308 03:28:25.506122 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.509853 master-0 kubenswrapper[13046]: I0308 03:28:25.508521 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77985828-fe36-4c71-afe2-3b0a69f6220b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.510331 master-0 kubenswrapper[13046]: I0308 03:28:25.510293 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.510475 master-0 kubenswrapper[13046]: I0308 03:28:25.510441 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.511038 master-0 kubenswrapper[13046]: I0308 03:28:25.511002 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-web-config\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.511075 master-0 kubenswrapper[13046]: I0308 03:28:25.511010 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.511476 master-0 kubenswrapper[13046]: I0308 03:28:25.511450 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77985828-fe36-4c71-afe2-3b0a69f6220b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.522245 master-0 kubenswrapper[13046]: I0308 03:28:25.522200 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzkj\" (UniqueName: \"kubernetes.io/projected/77985828-fe36-4c71-afe2-3b0a69f6220b-kube-api-access-jwzkj\") pod \"prometheus-k8s-0\" (UID: \"77985828-fe36-4c71-afe2-3b0a69f6220b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:25.660192 master-0 kubenswrapper[13046]: I0308 03:28:25.660133 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:28:26.111583 master-0 kubenswrapper[13046]: I0308 03:28:26.111522 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 08 03:28:26.128200 master-0 kubenswrapper[13046]: I0308 03:28:26.128157 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16bab511-b73b-4215-824b-641d45e7987b" path="/var/lib/kubelet/pods/16bab511-b73b-4215-824b-641d45e7987b/volumes" Mar 08 03:28:26.128952 master-0 kubenswrapper[13046]: I0308 03:28:26.128926 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36ba129-ccfe-4dfc-889a-aa98d4dece4a" path="/var/lib/kubelet/pods/b36ba129-ccfe-4dfc-889a-aa98d4dece4a/volumes" Mar 08 03:28:26.243933 master-0 kubenswrapper[13046]: I0308 03:28:26.243891 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerStarted","Data":"425fc4279faeef572b507d13ccaabd5cdad78ebe1b43440e7c11f35f72ad9449"} Mar 08 03:28:27.251982 master-0 kubenswrapper[13046]: I0308 03:28:27.251931 13046 generic.go:334] "Generic (PLEG): container finished" podID="77985828-fe36-4c71-afe2-3b0a69f6220b" containerID="0a2f1d383880de2fd4a7ddfae350554ff69e9da9cf423cde9ae5df4511dca45f" exitCode=0 Mar 08 03:28:27.251982 master-0 kubenswrapper[13046]: I0308 03:28:27.251979 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerDied","Data":"0a2f1d383880de2fd4a7ddfae350554ff69e9da9cf423cde9ae5df4511dca45f"} Mar 08 03:28:28.268961 master-0 kubenswrapper[13046]: I0308 03:28:28.268904 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerStarted","Data":"56af6ffe1ef623416632f8daf38d753b6a480a1118ba9d5f9f6a6d91cf15d3de"} Mar 08 03:28:28.269413 master-0 kubenswrapper[13046]: I0308 03:28:28.268973 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerStarted","Data":"69af6054a6ef4aa8ef36e1b1db9274f1a12e9bcb4a88ec698e25bfb014c904d0"} Mar 08 03:28:28.269413 master-0 kubenswrapper[13046]: I0308 03:28:28.268995 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerStarted","Data":"952fc0443a77a9fbb4209583f538f81459f5e0c9824182b7c11d7a1174ae8207"} Mar 08 03:28:28.269413 master-0 kubenswrapper[13046]: I0308 03:28:28.269015 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerStarted","Data":"5338c40e1876a4c76f72d6bf0b6765368aa0c7ac0c159e383b5b7fe15a566887"} Mar 08 03:28:28.269413 master-0 kubenswrapper[13046]: I0308 03:28:28.269031 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerStarted","Data":"ba94f87066691387a4a38a2657df4f43a9009e46e514d851e4fc2b9cfee5f054"} Mar 08 03:28:28.269413 master-0 kubenswrapper[13046]: I0308 03:28:28.269052 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77985828-fe36-4c71-afe2-3b0a69f6220b","Type":"ContainerStarted","Data":"82d976327a2f7e791914b2a9056f0a8127ccd9519dc4b6dbdda901fcfab80858"} Mar 08 03:28:28.331046 master-0 kubenswrapper[13046]: I0308 03:28:28.330865 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" 
podStartSLOduration=3.330830768 podStartE2EDuration="3.330830768s" podCreationTimestamp="2026-03-08 03:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:28:28.317357965 +0000 UTC m=+910.396125222" watchObservedRunningTime="2026-03-08 03:28:28.330830768 +0000 UTC m=+910.409598025" Mar 08 03:28:30.660789 master-0 kubenswrapper[13046]: I0308 03:28:30.660687 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:29:18.735074 master-0 kubenswrapper[13046]: I0308 03:29:18.734981 13046 scope.go:117] "RemoveContainer" containerID="94498e732862075a2e7db935be27489aa50ccc49f721fd0c7a11e45e6a5920c8" Mar 08 03:29:25.660860 master-0 kubenswrapper[13046]: I0308 03:29:25.660736 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:29:25.709086 master-0 kubenswrapper[13046]: I0308 03:29:25.708987 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:29:25.903408 master-0 kubenswrapper[13046]: I0308 03:29:25.903356 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 08 03:29:47.781556 master-0 kubenswrapper[13046]: I0308 03:29:47.781454 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-7ll2s"] Mar 08 03:29:47.782980 master-0 kubenswrapper[13046]: I0308 03:29:47.782942 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:47.786070 master-0 kubenswrapper[13046]: I0308 03:29:47.786011 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 08 03:29:47.786380 master-0 kubenswrapper[13046]: I0308 03:29:47.786322 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 08 03:29:47.786380 master-0 kubenswrapper[13046]: I0308 03:29:47.786365 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 08 03:29:47.791045 master-0 kubenswrapper[13046]: I0308 03:29:47.790991 13046 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 08 03:29:47.795687 master-0 kubenswrapper[13046]: I0308 03:29:47.795604 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-7ll2s"] Mar 08 03:29:47.928167 master-0 kubenswrapper[13046]: I0308 03:29:47.928107 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-os-client-config\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:47.928520 master-0 kubenswrapper[13046]: I0308 03:29:47.928245 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:47.928520 master-0 kubenswrapper[13046]: I0308 03:29:47.928462 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x82ft\" (UniqueName: \"kubernetes.io/projected/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-kube-api-access-x82ft\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.029752 master-0 kubenswrapper[13046]: I0308 03:29:48.029642 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x82ft\" (UniqueName: \"kubernetes.io/projected/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-kube-api-access-x82ft\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.030058 master-0 kubenswrapper[13046]: I0308 03:29:48.029847 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-os-client-config\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.030058 master-0 kubenswrapper[13046]: I0308 03:29:48.029936 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.031987 master-0 kubenswrapper[13046]: I0308 03:29:48.031879 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " 
pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.037325 master-0 kubenswrapper[13046]: I0308 03:29:48.037245 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-os-client-config\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.060146 master-0 kubenswrapper[13046]: I0308 03:29:48.060051 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x82ft\" (UniqueName: \"kubernetes.io/projected/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-kube-api-access-x82ft\") pod \"sushy-emulator-78f6d7d749-7ll2s\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.120655 master-0 kubenswrapper[13046]: I0308 03:29:48.120580 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:48.596626 master-0 kubenswrapper[13046]: I0308 03:29:48.596546 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-7ll2s"] Mar 08 03:29:48.610841 master-0 kubenswrapper[13046]: W0308 03:29:48.610763 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ecfd98_a9c0_459e_9ba9_b744f9a4b4f8.slice/crio-1ddca93e43838e863d038e1c117d92ed8d3dcd0f58883844e93fe1f1a1eba62a WatchSource:0}: Error finding container 1ddca93e43838e863d038e1c117d92ed8d3dcd0f58883844e93fe1f1a1eba62a: Status 404 returned error can't find the container with id 1ddca93e43838e863d038e1c117d92ed8d3dcd0f58883844e93fe1f1a1eba62a Mar 08 03:29:49.081065 master-0 kubenswrapper[13046]: I0308 03:29:49.080894 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" event={"ID":"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8","Type":"ContainerStarted","Data":"1ddca93e43838e863d038e1c117d92ed8d3dcd0f58883844e93fe1f1a1eba62a"} Mar 08 03:29:55.126008 master-0 kubenswrapper[13046]: I0308 03:29:55.125948 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" event={"ID":"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8","Type":"ContainerStarted","Data":"a8b4be0d1f53cf129f5d0218c0e9b67e67e55642130383a9a90503d5b655aa27"} Mar 08 03:29:58.132076 master-0 kubenswrapper[13046]: I0308 03:29:58.131984 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:58.132076 master-0 kubenswrapper[13046]: I0308 03:29:58.132062 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:58.137073 master-0 kubenswrapper[13046]: I0308 03:29:58.136973 13046 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:58.158714 master-0 kubenswrapper[13046]: I0308 03:29:58.158662 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:29:58.166092 master-0 kubenswrapper[13046]: I0308 03:29:58.165973 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" podStartSLOduration=5.018430006 podStartE2EDuration="11.165911786s" podCreationTimestamp="2026-03-08 03:29:47 +0000 UTC" firstStartedPulling="2026-03-08 03:29:48.615293093 +0000 UTC m=+990.694060350" lastFinishedPulling="2026-03-08 03:29:54.762774883 +0000 UTC m=+996.841542130" observedRunningTime="2026-03-08 03:29:55.155392993 +0000 UTC m=+997.234160230" watchObservedRunningTime="2026-03-08 03:29:58.165911786 +0000 UTC m=+1000.244679043" Mar 08 03:30:16.714525 master-0 kubenswrapper[13046]: I0308 03:30:16.714266 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-55455b79df-bpk6k"] Mar 08 03:30:16.719967 master-0 kubenswrapper[13046]: I0308 03:30:16.719766 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:16.729941 master-0 kubenswrapper[13046]: I0308 03:30:16.729842 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-55455b79df-bpk6k"] Mar 08 03:30:16.879877 master-0 kubenswrapper[13046]: I0308 03:30:16.879799 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33efb176-131c-43ae-9a6c-bc4acb97bff6-os-client-config\") pod \"nova-console-poller-55455b79df-bpk6k\" (UID: \"33efb176-131c-43ae-9a6c-bc4acb97bff6\") " pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:16.880205 master-0 kubenswrapper[13046]: I0308 03:30:16.879895 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp75k\" (UniqueName: \"kubernetes.io/projected/33efb176-131c-43ae-9a6c-bc4acb97bff6-kube-api-access-fp75k\") pod \"nova-console-poller-55455b79df-bpk6k\" (UID: \"33efb176-131c-43ae-9a6c-bc4acb97bff6\") " pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:16.982115 master-0 kubenswrapper[13046]: I0308 03:30:16.981927 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33efb176-131c-43ae-9a6c-bc4acb97bff6-os-client-config\") pod \"nova-console-poller-55455b79df-bpk6k\" (UID: \"33efb176-131c-43ae-9a6c-bc4acb97bff6\") " pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:16.982115 master-0 kubenswrapper[13046]: I0308 03:30:16.982066 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp75k\" (UniqueName: \"kubernetes.io/projected/33efb176-131c-43ae-9a6c-bc4acb97bff6-kube-api-access-fp75k\") pod \"nova-console-poller-55455b79df-bpk6k\" (UID: \"33efb176-131c-43ae-9a6c-bc4acb97bff6\") " 
pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:16.987217 master-0 kubenswrapper[13046]: I0308 03:30:16.987152 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33efb176-131c-43ae-9a6c-bc4acb97bff6-os-client-config\") pod \"nova-console-poller-55455b79df-bpk6k\" (UID: \"33efb176-131c-43ae-9a6c-bc4acb97bff6\") " pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:17.011499 master-0 kubenswrapper[13046]: I0308 03:30:17.011419 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp75k\" (UniqueName: \"kubernetes.io/projected/33efb176-131c-43ae-9a6c-bc4acb97bff6-kube-api-access-fp75k\") pod \"nova-console-poller-55455b79df-bpk6k\" (UID: \"33efb176-131c-43ae-9a6c-bc4acb97bff6\") " pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:17.043584 master-0 kubenswrapper[13046]: I0308 03:30:17.043452 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" Mar 08 03:30:17.512940 master-0 kubenswrapper[13046]: I0308 03:30:17.512841 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-55455b79df-bpk6k"] Mar 08 03:30:18.355751 master-0 kubenswrapper[13046]: I0308 03:30:18.355678 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" event={"ID":"33efb176-131c-43ae-9a6c-bc4acb97bff6","Type":"ContainerStarted","Data":"099091fab39bb265e94ccce8857564d4811e77772801148a6fd1953cc447a9f9"} Mar 08 03:30:18.713077 master-0 kubenswrapper[13046]: E0308 03:30:18.712959 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff: no such file or directory, extraDiskErr: Mar 08 03:30:22.556624 master-0 kubenswrapper[13046]: I0308 03:30:22.554724 13046 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 03:30:23.405866 master-0 kubenswrapper[13046]: I0308 03:30:23.405797 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" event={"ID":"33efb176-131c-43ae-9a6c-bc4acb97bff6","Type":"ContainerStarted","Data":"c633b376494bbecbf6cdd0f0b2a98c9e1a1afcd90ea900b3fe634b11765fe281"} Mar 08 03:30:23.405866 master-0 kubenswrapper[13046]: I0308 03:30:23.405840 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" event={"ID":"33efb176-131c-43ae-9a6c-bc4acb97bff6","Type":"ContainerStarted","Data":"dd22cb4f36117237d5b82b333415036bde606adfdc4cf3bf69c9567c0987af94"} Mar 08 03:30:23.434770 master-0 kubenswrapper[13046]: I0308 03:30:23.434661 
13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-poller-55455b79df-bpk6k" podStartSLOduration=1.947568454 podStartE2EDuration="7.434631809s" podCreationTimestamp="2026-03-08 03:30:16 +0000 UTC" firstStartedPulling="2026-03-08 03:30:17.516515928 +0000 UTC m=+1019.595283185" lastFinishedPulling="2026-03-08 03:30:23.003579313 +0000 UTC m=+1025.082346540" observedRunningTime="2026-03-08 03:30:23.430035339 +0000 UTC m=+1025.508802556" watchObservedRunningTime="2026-03-08 03:30:23.434631809 +0000 UTC m=+1025.513399066" Mar 08 03:30:49.299522 master-0 kubenswrapper[13046]: I0308 03:30:49.295864 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-584784957c-dbkrm"] Mar 08 03:30:49.299522 master-0 kubenswrapper[13046]: I0308 03:30:49.296928 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" Mar 08 03:30:49.315509 master-0 kubenswrapper[13046]: I0308 03:30:49.312448 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-584784957c-dbkrm"] Mar 08 03:30:49.403741 master-0 kubenswrapper[13046]: I0308 03:30:49.403686 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-nova-console-recordings-pv\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" Mar 08 03:30:49.403987 master-0 kubenswrapper[13046]: I0308 03:30:49.403764 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4cg\" (UniqueName: \"kubernetes.io/projected/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-kube-api-access-pn4cg\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: 
\"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" Mar 08 03:30:49.403987 master-0 kubenswrapper[13046]: I0308 03:30:49.403821 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-os-client-config\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" Mar 08 03:30:49.505357 master-0 kubenswrapper[13046]: I0308 03:30:49.505272 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-nova-console-recordings-pv\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" Mar 08 03:30:49.505659 master-0 kubenswrapper[13046]: I0308 03:30:49.505457 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn4cg\" (UniqueName: \"kubernetes.io/projected/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-kube-api-access-pn4cg\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" Mar 08 03:30:49.505720 master-0 kubenswrapper[13046]: I0308 03:30:49.505684 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-os-client-config\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" Mar 08 03:30:49.517643 master-0 kubenswrapper[13046]: I0308 03:30:49.511303 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-os-client-config\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm"
Mar 08 03:30:49.539678 master-0 kubenswrapper[13046]: I0308 03:30:49.536036 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn4cg\" (UniqueName: \"kubernetes.io/projected/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-kube-api-access-pn4cg\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm"
Mar 08 03:30:50.223835 master-0 kubenswrapper[13046]: I0308 03:30:50.223731 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/bc89cbbc-86be-411b-938b-edb1a1b4c4dd-nova-console-recordings-pv\") pod \"nova-console-recorder-584784957c-dbkrm\" (UID: \"bc89cbbc-86be-411b-938b-edb1a1b4c4dd\") " pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm"
Mar 08 03:30:50.253704 master-0 kubenswrapper[13046]: I0308 03:30:50.253595 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm"
Mar 08 03:30:50.827167 master-0 kubenswrapper[13046]: W0308 03:30:50.827104 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc89cbbc_86be_411b_938b_edb1a1b4c4dd.slice/crio-5f8b13d6c4ddedf71d6210a1e1b649c295ad9284d221549bca38adf488596840 WatchSource:0}: Error finding container 5f8b13d6c4ddedf71d6210a1e1b649c295ad9284d221549bca38adf488596840: Status 404 returned error can't find the container with id 5f8b13d6c4ddedf71d6210a1e1b649c295ad9284d221549bca38adf488596840
Mar 08 03:30:50.841923 master-0 kubenswrapper[13046]: I0308 03:30:50.840377 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-584784957c-dbkrm"]
Mar 08 03:30:50.986886 master-0 kubenswrapper[13046]: I0308 03:30:50.986788 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" event={"ID":"bc89cbbc-86be-411b-938b-edb1a1b4c4dd","Type":"ContainerStarted","Data":"5f8b13d6c4ddedf71d6210a1e1b649c295ad9284d221549bca38adf488596840"}
Mar 08 03:30:59.085650 master-0 kubenswrapper[13046]: I0308 03:30:59.085558 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" event={"ID":"bc89cbbc-86be-411b-938b-edb1a1b4c4dd","Type":"ContainerStarted","Data":"ba2113b6feb11f85edf0aa873f722a57b82258c1bebc2397ffea872d222f8481"}
Mar 08 03:31:00.099527 master-0 kubenswrapper[13046]: I0308 03:31:00.099396 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" event={"ID":"bc89cbbc-86be-411b-938b-edb1a1b4c4dd","Type":"ContainerStarted","Data":"2ede2c264e80ab690a59ff2dfcdc33e2462021bde045baa478b75d3301e060df"}
Mar 08 03:31:00.124362 master-0 kubenswrapper[13046]: I0308 03:31:00.124258 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-recorder-584784957c-dbkrm" podStartSLOduration=2.835412623 podStartE2EDuration="11.124236361s" podCreationTimestamp="2026-03-08 03:30:49 +0000 UTC" firstStartedPulling="2026-03-08 03:30:50.829998352 +0000 UTC m=+1052.908765609" lastFinishedPulling="2026-03-08 03:30:59.11882213 +0000 UTC m=+1061.197589347" observedRunningTime="2026-03-08 03:31:00.119988181 +0000 UTC m=+1062.198755388" watchObservedRunningTime="2026-03-08 03:31:00.124236361 +0000 UTC m=+1062.203003588"
Mar 08 03:31:18.862753 master-0 kubenswrapper[13046]: I0308 03:31:18.862678 13046 scope.go:117] "RemoveContainer" containerID="35834a4aa2c1aafe2e80cffe71b4934a09d612026d02ceb8d478e6578d08c89b"
Mar 08 03:31:18.887474 master-0 kubenswrapper[13046]: I0308 03:31:18.887430 13046 scope.go:117] "RemoveContainer" containerID="b162d8349d83460cb664c5872e401282175cb86df3f0012fb7fce29a941e6bca"
Mar 08 03:31:18.914244 master-0 kubenswrapper[13046]: I0308 03:31:18.914146 13046 scope.go:117] "RemoveContainer" containerID="85d08f576755b1f4982d207073aca843243efd692daf52094875911a38bb8b2f"
Mar 08 03:31:18.937510 master-0 kubenswrapper[13046]: I0308 03:31:18.937434 13046 scope.go:117] "RemoveContainer" containerID="33c664004c9c37147e8697c5db7face7d0516f1bc99f1c6f6db18ef4b8f7bcd8"
Mar 08 03:31:28.784859 master-0 kubenswrapper[13046]: I0308 03:31:28.784733 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"]
Mar 08 03:31:28.786349 master-0 kubenswrapper[13046]: I0308 03:31:28.786317 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.791044 master-0 kubenswrapper[13046]: I0308 03:31:28.788549 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-dsn45"
Mar 08 03:31:28.808526 master-0 kubenswrapper[13046]: I0308 03:31:28.807046 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"]
Mar 08 03:31:28.853080 master-0 kubenswrapper[13046]: I0308 03:31:28.853024 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sglgn\" (UniqueName: \"kubernetes.io/projected/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-kube-api-access-sglgn\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.853294 master-0 kubenswrapper[13046]: I0308 03:31:28.853151 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.853294 master-0 kubenswrapper[13046]: I0308 03:31:28.853198 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.955330 master-0 kubenswrapper[13046]: I0308 03:31:28.955255 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.955330 master-0 kubenswrapper[13046]: I0308 03:31:28.955328 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.955683 master-0 kubenswrapper[13046]: I0308 03:31:28.955625 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sglgn\" (UniqueName: \"kubernetes.io/projected/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-kube-api-access-sglgn\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.956077 master-0 kubenswrapper[13046]: I0308 03:31:28.956041 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.956077 master-0 kubenswrapper[13046]: I0308 03:31:28.956038 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:28.972541 master-0 kubenswrapper[13046]: I0308 03:31:28.972190 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sglgn\" (UniqueName: \"kubernetes.io/projected/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-kube-api-access-sglgn\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:29.102559 master-0 kubenswrapper[13046]: I0308 03:31:29.102520 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:29.546228 master-0 kubenswrapper[13046]: I0308 03:31:29.546147 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"]
Mar 08 03:31:30.407010 master-0 kubenswrapper[13046]: I0308 03:31:30.406950 13046 generic.go:334] "Generic (PLEG): container finished" podID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerID="514f80e3475a4fd5433486e72463e8fd7d1ce0f3d1da45a411130ef17249a0ad" exitCode=0
Mar 08 03:31:30.407812 master-0 kubenswrapper[13046]: I0308 03:31:30.407006 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl" event={"ID":"9e71f7ae-cfaf-4174-8968-2211f97c9b1f","Type":"ContainerDied","Data":"514f80e3475a4fd5433486e72463e8fd7d1ce0f3d1da45a411130ef17249a0ad"}
Mar 08 03:31:30.407812 master-0 kubenswrapper[13046]: I0308 03:31:30.407072 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl" event={"ID":"9e71f7ae-cfaf-4174-8968-2211f97c9b1f","Type":"ContainerStarted","Data":"f85b7543d9f0286c70bdcc8f9f603676b13fc58a8d3e302456c0c93c6fec7eea"}
Mar 08 03:31:32.431521 master-0 kubenswrapper[13046]: I0308 03:31:32.431427 13046 generic.go:334] "Generic (PLEG): container finished" podID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerID="e3fd3027e49738572bbba053f5cbc74db7ddfa26f798acfe2fde5f5d9a241b53" exitCode=0
Mar 08 03:31:32.432131 master-0 kubenswrapper[13046]: I0308 03:31:32.431531 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl" event={"ID":"9e71f7ae-cfaf-4174-8968-2211f97c9b1f","Type":"ContainerDied","Data":"e3fd3027e49738572bbba053f5cbc74db7ddfa26f798acfe2fde5f5d9a241b53"}
Mar 08 03:31:33.446104 master-0 kubenswrapper[13046]: I0308 03:31:33.446034 13046 generic.go:334] "Generic (PLEG): container finished" podID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerID="8ceb67f73957cabe2a2f319675e1dbcf5b904c132d0049b5ffc1d6021f5a6101" exitCode=0
Mar 08 03:31:33.447052 master-0 kubenswrapper[13046]: I0308 03:31:33.446159 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl" event={"ID":"9e71f7ae-cfaf-4174-8968-2211f97c9b1f","Type":"ContainerDied","Data":"8ceb67f73957cabe2a2f319675e1dbcf5b904c132d0049b5ffc1d6021f5a6101"}
Mar 08 03:31:34.878998 master-0 kubenswrapper[13046]: I0308 03:31:34.878876 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:34.974245 master-0 kubenswrapper[13046]: I0308 03:31:34.973144 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-bundle\") pod \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") "
Mar 08 03:31:34.974245 master-0 kubenswrapper[13046]: I0308 03:31:34.973222 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-util\") pod \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") "
Mar 08 03:31:34.974245 master-0 kubenswrapper[13046]: I0308 03:31:34.973327 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sglgn\" (UniqueName: \"kubernetes.io/projected/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-kube-api-access-sglgn\") pod \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\" (UID: \"9e71f7ae-cfaf-4174-8968-2211f97c9b1f\") "
Mar 08 03:31:34.975001 master-0 kubenswrapper[13046]: I0308 03:31:34.974279 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-bundle" (OuterVolumeSpecName: "bundle") pod "9e71f7ae-cfaf-4174-8968-2211f97c9b1f" (UID: "9e71f7ae-cfaf-4174-8968-2211f97c9b1f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:31:34.986417 master-0 kubenswrapper[13046]: I0308 03:31:34.982718 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-kube-api-access-sglgn" (OuterVolumeSpecName: "kube-api-access-sglgn") pod "9e71f7ae-cfaf-4174-8968-2211f97c9b1f" (UID: "9e71f7ae-cfaf-4174-8968-2211f97c9b1f"). InnerVolumeSpecName "kube-api-access-sglgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:31:34.999517 master-0 kubenswrapper[13046]: I0308 03:31:34.999439 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-util" (OuterVolumeSpecName: "util") pod "9e71f7ae-cfaf-4174-8968-2211f97c9b1f" (UID: "9e71f7ae-cfaf-4174-8968-2211f97c9b1f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:31:35.075179 master-0 kubenswrapper[13046]: I0308 03:31:35.075044 13046 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:31:35.075179 master-0 kubenswrapper[13046]: I0308 03:31:35.075107 13046 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-util\") on node \"master-0\" DevicePath \"\""
Mar 08 03:31:35.075179 master-0 kubenswrapper[13046]: I0308 03:31:35.075131 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sglgn\" (UniqueName: \"kubernetes.io/projected/9e71f7ae-cfaf-4174-8968-2211f97c9b1f-kube-api-access-sglgn\") on node \"master-0\" DevicePath \"\""
Mar 08 03:31:35.478565 master-0 kubenswrapper[13046]: I0308 03:31:35.477187 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl" event={"ID":"9e71f7ae-cfaf-4174-8968-2211f97c9b1f","Type":"ContainerDied","Data":"f85b7543d9f0286c70bdcc8f9f603676b13fc58a8d3e302456c0c93c6fec7eea"}
Mar 08 03:31:35.478565 master-0 kubenswrapper[13046]: I0308 03:31:35.477250 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f85b7543d9f0286c70bdcc8f9f603676b13fc58a8d3e302456c0c93c6fec7eea"
Mar 08 03:31:35.478565 master-0 kubenswrapper[13046]: I0308 03:31:35.477350 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d49ckbl"
Mar 08 03:31:41.980600 master-0 kubenswrapper[13046]: I0308 03:31:41.974608 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-5b5fb85dc8-ft554"]
Mar 08 03:31:41.981075 master-0 kubenswrapper[13046]: E0308 03:31:41.980942 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerName="util"
Mar 08 03:31:41.981075 master-0 kubenswrapper[13046]: I0308 03:31:41.980964 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerName="util"
Mar 08 03:31:41.981075 master-0 kubenswrapper[13046]: E0308 03:31:41.980989 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerName="extract"
Mar 08 03:31:41.981075 master-0 kubenswrapper[13046]: I0308 03:31:41.980996 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerName="extract"
Mar 08 03:31:41.981075 master-0 kubenswrapper[13046]: E0308 03:31:41.981008 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerName="pull"
Mar 08 03:31:41.981075 master-0 kubenswrapper[13046]: I0308 03:31:41.981016 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerName="pull"
Mar 08 03:31:41.981284 master-0 kubenswrapper[13046]: I0308 03:31:41.981184 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e71f7ae-cfaf-4174-8968-2211f97c9b1f" containerName="extract"
Mar 08 03:31:41.981878 master-0 kubenswrapper[13046]: I0308 03:31:41.981845 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:41.986042 master-0 kubenswrapper[13046]: I0308 03:31:41.985997 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt"
Mar 08 03:31:41.986042 master-0 kubenswrapper[13046]: I0308 03:31:41.986038 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt"
Mar 08 03:31:41.986269 master-0 kubenswrapper[13046]: I0308 03:31:41.985997 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert"
Mar 08 03:31:41.986269 master-0 kubenswrapper[13046]: I0308 03:31:41.986245 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert"
Mar 08 03:31:41.986443 master-0 kubenswrapper[13046]: I0308 03:31:41.986288 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert"
Mar 08 03:31:42.002423 master-0 kubenswrapper[13046]: I0308 03:31:41.997434 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-5b5fb85dc8-ft554"]
Mar 08 03:31:42.104304 master-0 kubenswrapper[13046]: I0308 03:31:42.104235 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-metrics-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.104513 master-0 kubenswrapper[13046]: I0308 03:31:42.104319 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef58eeba-80e7-46f6-adf9-5469d699f2e3-socket-dir\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.104513 master-0 kubenswrapper[13046]: I0308 03:31:42.104423 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-webhook-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.104668 master-0 kubenswrapper[13046]: I0308 03:31:42.104627 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-apiservice-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.104770 master-0 kubenswrapper[13046]: I0308 03:31:42.104752 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z8js\" (UniqueName: \"kubernetes.io/projected/ef58eeba-80e7-46f6-adf9-5469d699f2e3-kube-api-access-6z8js\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.206107 master-0 kubenswrapper[13046]: I0308 03:31:42.206051 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-apiservice-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.206371 master-0 kubenswrapper[13046]: I0308 03:31:42.206320 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z8js\" (UniqueName: \"kubernetes.io/projected/ef58eeba-80e7-46f6-adf9-5469d699f2e3-kube-api-access-6z8js\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.206439 master-0 kubenswrapper[13046]: I0308 03:31:42.206408 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-metrics-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.206657 master-0 kubenswrapper[13046]: I0308 03:31:42.206635 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef58eeba-80e7-46f6-adf9-5469d699f2e3-socket-dir\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.206736 master-0 kubenswrapper[13046]: I0308 03:31:42.206715 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-webhook-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.207122 master-0 kubenswrapper[13046]: I0308 03:31:42.207083 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef58eeba-80e7-46f6-adf9-5469d699f2e3-socket-dir\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.210349 master-0 kubenswrapper[13046]: I0308 03:31:42.210316 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-apiservice-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.213591 master-0 kubenswrapper[13046]: I0308 03:31:42.212270 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-webhook-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.213591 master-0 kubenswrapper[13046]: I0308 03:31:42.212466 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ef58eeba-80e7-46f6-adf9-5469d699f2e3-metrics-cert\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.223755 master-0 kubenswrapper[13046]: I0308 03:31:42.223705 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z8js\" (UniqueName: \"kubernetes.io/projected/ef58eeba-80e7-46f6-adf9-5469d699f2e3-kube-api-access-6z8js\") pod \"lvms-operator-5b5fb85dc8-ft554\" (UID: \"ef58eeba-80e7-46f6-adf9-5469d699f2e3\") " pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.324091 master-0 kubenswrapper[13046]: I0308 03:31:42.323953 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:42.789041 master-0 kubenswrapper[13046]: I0308 03:31:42.788993 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-5b5fb85dc8-ft554"]
Mar 08 03:31:42.794700 master-0 kubenswrapper[13046]: W0308 03:31:42.794660 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef58eeba_80e7_46f6_adf9_5469d699f2e3.slice/crio-cfae5de256f9268e8de5152cd93d259f95dc07fa992cd53562b795bc1989621e WatchSource:0}: Error finding container cfae5de256f9268e8de5152cd93d259f95dc07fa992cd53562b795bc1989621e: Status 404 returned error can't find the container with id cfae5de256f9268e8de5152cd93d259f95dc07fa992cd53562b795bc1989621e
Mar 08 03:31:43.555505 master-0 kubenswrapper[13046]: I0308 03:31:43.554939 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554" event={"ID":"ef58eeba-80e7-46f6-adf9-5469d699f2e3","Type":"ContainerStarted","Data":"cfae5de256f9268e8de5152cd93d259f95dc07fa992cd53562b795bc1989621e"}
Mar 08 03:31:48.608311 master-0 kubenswrapper[13046]: I0308 03:31:48.608213 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554" event={"ID":"ef58eeba-80e7-46f6-adf9-5469d699f2e3","Type":"ContainerStarted","Data":"d98ad253f4679b6a741960096dedc7dc97ea197d1d48ebf19062b578a0ed0452"}
Mar 08 03:31:48.609282 master-0 kubenswrapper[13046]: I0308 03:31:48.608818 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:48.614523 master-0 kubenswrapper[13046]: I0308 03:31:48.614428 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554"
Mar 08 03:31:48.657249 master-0 kubenswrapper[13046]: I0308 03:31:48.657156 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-5b5fb85dc8-ft554" podStartSLOduration=2.632571906 podStartE2EDuration="7.657125646s" podCreationTimestamp="2026-03-08 03:31:41 +0000 UTC" firstStartedPulling="2026-03-08 03:31:42.803856151 +0000 UTC m=+1104.882623368" lastFinishedPulling="2026-03-08 03:31:47.828409891 +0000 UTC m=+1109.907177108" observedRunningTime="2026-03-08 03:31:48.644280343 +0000 UTC m=+1110.723047600" watchObservedRunningTime="2026-03-08 03:31:48.657125646 +0000 UTC m=+1110.735892904"
Mar 08 03:31:52.546127 master-0 kubenswrapper[13046]: I0308 03:31:52.546053 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"]
Mar 08 03:31:52.548877 master-0 kubenswrapper[13046]: I0308 03:31:52.548826 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.551054 master-0 kubenswrapper[13046]: I0308 03:31:52.551001 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-dsn45"
Mar 08 03:31:52.557994 master-0 kubenswrapper[13046]: I0308 03:31:52.557947 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"]
Mar 08 03:31:52.702131 master-0 kubenswrapper[13046]: I0308 03:31:52.701949 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.702547 master-0 kubenswrapper[13046]: I0308 03:31:52.702478 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.702852 master-0 kubenswrapper[13046]: I0308 03:31:52.702802 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwsm\" (UniqueName: \"kubernetes.io/projected/53e67e80-7f34-4f9a-897a-7ef14440a56e-kube-api-access-ggwsm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.804437 master-0 kubenswrapper[13046]: I0308 03:31:52.804299 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.804927 master-0 kubenswrapper[13046]: I0308 03:31:52.804891 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.805176 master-0 kubenswrapper[13046]: I0308 03:31:52.805122 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwsm\" (UniqueName: \"kubernetes.io/projected/53e67e80-7f34-4f9a-897a-7ef14440a56e-kube-api-access-ggwsm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.805473 master-0 kubenswrapper[13046]: I0308 03:31:52.804932 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.805473 master-0 kubenswrapper[13046]: I0308 03:31:52.805223 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.823206 master-0 kubenswrapper[13046]: I0308 03:31:52.823130 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwsm\" (UniqueName: \"kubernetes.io/projected/53e67e80-7f34-4f9a-897a-7ef14440a56e-kube-api-access-ggwsm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:52.867747 master-0 kubenswrapper[13046]: I0308 03:31:52.867640 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"
Mar 08 03:31:53.731935 master-0 kubenswrapper[13046]: I0308 03:31:53.731859 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b"]
Mar 08 03:31:53.735877 master-0 kubenswrapper[13046]: W0308 03:31:53.735709 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e67e80_7f34_4f9a_897a_7ef14440a56e.slice/crio-791b7724f5d368c66bacbe7595e21077505394772c99e13659d01bbfe08b1fd4 WatchSource:0}: Error finding container 791b7724f5d368c66bacbe7595e21077505394772c99e13659d01bbfe08b1fd4: Status 404 returned error can't find the container with id 791b7724f5d368c66bacbe7595e21077505394772c99e13659d01bbfe08b1fd4
Mar 08 03:31:54.655553 master-0 kubenswrapper[13046]: I0308 03:31:54.655508 13046 generic.go:334] "Generic (PLEG): container finished" podID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerID="99a5f71e480ef9e7b018b85b104761069a44dfb81efac73d76a6d36de47d7d23" exitCode=0
Mar 08 03:31:54.655553 master-0 kubenswrapper[13046]: I0308 03:31:54.655552 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b" event={"ID":"53e67e80-7f34-4f9a-897a-7ef14440a56e","Type":"ContainerDied","Data":"99a5f71e480ef9e7b018b85b104761069a44dfb81efac73d76a6d36de47d7d23"}
Mar 08 03:31:54.655827 master-0 kubenswrapper[13046]: I0308 03:31:54.655576 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b" event={"ID":"53e67e80-7f34-4f9a-897a-7ef14440a56e","Type":"ContainerStarted","Data":"791b7724f5d368c66bacbe7595e21077505394772c99e13659d01bbfe08b1fd4"}
Mar 08 03:31:54.956609 master-0 kubenswrapper[13046]: I0308 03:31:54.955350 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"]
Mar 08 03:31:54.958027 master-0 kubenswrapper[13046]: I0308 03:31:54.957960 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"
Mar 08 03:31:54.969998 master-0 kubenswrapper[13046]: I0308 03:31:54.969236 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"]
Mar 08 03:31:55.062575 master-0 kubenswrapper[13046]: I0308 03:31:55.062316 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"
Mar 08 03:31:55.062575 master-0 kubenswrapper[13046]: I0308 03:31:55.062451 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49b48\" (UniqueName: \"kubernetes.io/projected/cd993df0-4411-46e5-8b19-779692a09d01-kube-api-access-49b48\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"
Mar 08 03:31:55.063535 master-0 kubenswrapper[13046]: I0308 03:31:55.063460 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"
Mar 08 03:31:55.164932 master-0 kubenswrapper[13046]: I0308 03:31:55.164856 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"
Mar 08 03:31:55.165126 master-0 kubenswrapper[13046]: I0308 03:31:55.164943 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49b48\" (UniqueName: \"kubernetes.io/projected/cd993df0-4411-46e5-8b19-779692a09d01-kube-api-access-49b48\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"
Mar 08 03:31:55.165292 master-0 kubenswrapper[13046]: I0308 03:31:55.165253 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"
Mar 08 03:31:55.165873 master-0 kubenswrapper[13046]: I0308 03:31:55.165791 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") "
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" Mar 08 03:31:55.166060 master-0 kubenswrapper[13046]: I0308 03:31:55.166010 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" Mar 08 03:31:55.193593 master-0 kubenswrapper[13046]: I0308 03:31:55.193521 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49b48\" (UniqueName: \"kubernetes.io/projected/cd993df0-4411-46e5-8b19-779692a09d01-kube-api-access-49b48\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" Mar 08 03:31:55.279980 master-0 kubenswrapper[13046]: I0308 03:31:55.279802 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" Mar 08 03:31:55.399940 master-0 kubenswrapper[13046]: I0308 03:31:55.399843 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds"] Mar 08 03:31:55.401520 master-0 kubenswrapper[13046]: I0308 03:31:55.401459 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds"] Mar 08 03:31:55.401640 master-0 kubenswrapper[13046]: I0308 03:31:55.401612 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.470474 master-0 kubenswrapper[13046]: I0308 03:31:55.470408 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.470702 master-0 kubenswrapper[13046]: I0308 03:31:55.470555 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5g9c\" (UniqueName: \"kubernetes.io/projected/4578aaaa-d906-4d09-a9e2-a55dc5839447-kube-api-access-v5g9c\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.470702 master-0 kubenswrapper[13046]: I0308 03:31:55.470591 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.573361 master-0 kubenswrapper[13046]: I0308 03:31:55.573243 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.573905 master-0 kubenswrapper[13046]: I0308 03:31:55.573579 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5g9c\" (UniqueName: \"kubernetes.io/projected/4578aaaa-d906-4d09-a9e2-a55dc5839447-kube-api-access-v5g9c\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.573905 master-0 kubenswrapper[13046]: I0308 03:31:55.573698 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.574073 master-0 kubenswrapper[13046]: I0308 03:31:55.573928 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.574287 master-0 kubenswrapper[13046]: I0308 03:31:55.574245 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.595054 
master-0 kubenswrapper[13046]: I0308 03:31:55.594999 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5g9c\" (UniqueName: \"kubernetes.io/projected/4578aaaa-d906-4d09-a9e2-a55dc5839447-kube-api-access-v5g9c\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.744212 master-0 kubenswrapper[13046]: I0308 03:31:55.744140 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:31:55.818397 master-0 kubenswrapper[13046]: W0308 03:31:55.818332 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd993df0_4411_46e5_8b19_779692a09d01.slice/crio-1e39820b1c5fcd9eb5b0132020b7a542773156a7e2d4b29878ac02babc055820 WatchSource:0}: Error finding container 1e39820b1c5fcd9eb5b0132020b7a542773156a7e2d4b29878ac02babc055820: Status 404 returned error can't find the container with id 1e39820b1c5fcd9eb5b0132020b7a542773156a7e2d4b29878ac02babc055820 Mar 08 03:31:55.818626 master-0 kubenswrapper[13046]: I0308 03:31:55.818513 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh"] Mar 08 03:31:56.230114 master-0 kubenswrapper[13046]: I0308 03:31:56.229063 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds"] Mar 08 03:31:56.681922 master-0 kubenswrapper[13046]: I0308 03:31:56.681849 13046 generic.go:334] "Generic (PLEG): container finished" podID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerID="04df4ccd8751ea4b23424e3ed560866cd9fa845352df42c9c6d44fba416d2b46" exitCode=0 Mar 08 
03:31:56.682151 master-0 kubenswrapper[13046]: I0308 03:31:56.681974 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" event={"ID":"4578aaaa-d906-4d09-a9e2-a55dc5839447","Type":"ContainerDied","Data":"04df4ccd8751ea4b23424e3ed560866cd9fa845352df42c9c6d44fba416d2b46"} Mar 08 03:31:56.682151 master-0 kubenswrapper[13046]: I0308 03:31:56.682014 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" event={"ID":"4578aaaa-d906-4d09-a9e2-a55dc5839447","Type":"ContainerStarted","Data":"c1f275deaf30322ddb50bbb6886ad52914f83202a67bf03558bd01a2cb2b05de"} Mar 08 03:31:56.688568 master-0 kubenswrapper[13046]: I0308 03:31:56.685875 13046 generic.go:334] "Generic (PLEG): container finished" podID="cd993df0-4411-46e5-8b19-779692a09d01" containerID="0d40d6e828ff66b1557eca9d19d21064eb5110c58df451e87b75afb2bdf75cd5" exitCode=0 Mar 08 03:31:56.688568 master-0 kubenswrapper[13046]: I0308 03:31:56.685944 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" event={"ID":"cd993df0-4411-46e5-8b19-779692a09d01","Type":"ContainerDied","Data":"0d40d6e828ff66b1557eca9d19d21064eb5110c58df451e87b75afb2bdf75cd5"} Mar 08 03:31:56.688568 master-0 kubenswrapper[13046]: I0308 03:31:56.685973 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" event={"ID":"cd993df0-4411-46e5-8b19-779692a09d01","Type":"ContainerStarted","Data":"1e39820b1c5fcd9eb5b0132020b7a542773156a7e2d4b29878ac02babc055820"} Mar 08 03:31:58.704062 master-0 kubenswrapper[13046]: I0308 03:31:58.704004 13046 generic.go:334] "Generic (PLEG): container finished" podID="53e67e80-7f34-4f9a-897a-7ef14440a56e" 
containerID="c89b4bc81ded52227f39ad14a4b1f39bbea8bd158e6836449a8f75e3c6f16c96" exitCode=0 Mar 08 03:31:58.704668 master-0 kubenswrapper[13046]: I0308 03:31:58.704069 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b" event={"ID":"53e67e80-7f34-4f9a-897a-7ef14440a56e","Type":"ContainerDied","Data":"c89b4bc81ded52227f39ad14a4b1f39bbea8bd158e6836449a8f75e3c6f16c96"} Mar 08 03:31:59.716149 master-0 kubenswrapper[13046]: I0308 03:31:59.714412 13046 generic.go:334] "Generic (PLEG): container finished" podID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerID="2b96775df96b65f9b53b2778d717fc9c7aa05af55241ca0e09a005ab14c3941a" exitCode=0 Mar 08 03:31:59.716149 master-0 kubenswrapper[13046]: I0308 03:31:59.714525 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b" event={"ID":"53e67e80-7f34-4f9a-897a-7ef14440a56e","Type":"ContainerDied","Data":"2b96775df96b65f9b53b2778d717fc9c7aa05af55241ca0e09a005ab14c3941a"} Mar 08 03:31:59.717702 master-0 kubenswrapper[13046]: I0308 03:31:59.717451 13046 generic.go:334] "Generic (PLEG): container finished" podID="cd993df0-4411-46e5-8b19-779692a09d01" containerID="7181fd6e21bdcc764dec7efa3b891957926ba00bae4fcaa91c41b7724d1bb8b0" exitCode=0 Mar 08 03:31:59.717702 master-0 kubenswrapper[13046]: I0308 03:31:59.717607 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" event={"ID":"cd993df0-4411-46e5-8b19-779692a09d01","Type":"ContainerDied","Data":"7181fd6e21bdcc764dec7efa3b891957926ba00bae4fcaa91c41b7724d1bb8b0"} Mar 08 03:31:59.720269 master-0 kubenswrapper[13046]: I0308 03:31:59.720069 13046 generic.go:334] "Generic (PLEG): container finished" podID="4578aaaa-d906-4d09-a9e2-a55dc5839447" 
containerID="cea5a6c0c15cb41c843e0b92c3bed16c9dc573195d4d79f28b96bce45676b496" exitCode=0 Mar 08 03:31:59.720269 master-0 kubenswrapper[13046]: I0308 03:31:59.720124 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" event={"ID":"4578aaaa-d906-4d09-a9e2-a55dc5839447","Type":"ContainerDied","Data":"cea5a6c0c15cb41c843e0b92c3bed16c9dc573195d4d79f28b96bce45676b496"} Mar 08 03:32:00.735441 master-0 kubenswrapper[13046]: I0308 03:32:00.735356 13046 generic.go:334] "Generic (PLEG): container finished" podID="cd993df0-4411-46e5-8b19-779692a09d01" containerID="111766c1fa421e1c6e0e9f1dbf3c6455aa1e664e97681e5f1b7d9e835dc2c361" exitCode=0 Mar 08 03:32:00.735441 master-0 kubenswrapper[13046]: I0308 03:32:00.735420 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" event={"ID":"cd993df0-4411-46e5-8b19-779692a09d01","Type":"ContainerDied","Data":"111766c1fa421e1c6e0e9f1dbf3c6455aa1e664e97681e5f1b7d9e835dc2c361"} Mar 08 03:32:00.741227 master-0 kubenswrapper[13046]: I0308 03:32:00.741159 13046 generic.go:334] "Generic (PLEG): container finished" podID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerID="a55496fabe898ede73529fe4ac8ac5685b71bd6b670b928a2f4ade7ebdea6fc4" exitCode=0 Mar 08 03:32:00.741635 master-0 kubenswrapper[13046]: I0308 03:32:00.741574 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" event={"ID":"4578aaaa-d906-4d09-a9e2-a55dc5839447","Type":"ContainerDied","Data":"a55496fabe898ede73529fe4ac8ac5685b71bd6b670b928a2f4ade7ebdea6fc4"} Mar 08 03:32:01.231395 master-0 kubenswrapper[13046]: I0308 03:32:01.231295 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b" Mar 08 03:32:01.383174 master-0 kubenswrapper[13046]: I0308 03:32:01.383044 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-bundle\") pod \"53e67e80-7f34-4f9a-897a-7ef14440a56e\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " Mar 08 03:32:01.383174 master-0 kubenswrapper[13046]: I0308 03:32:01.383151 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-util\") pod \"53e67e80-7f34-4f9a-897a-7ef14440a56e\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " Mar 08 03:32:01.383750 master-0 kubenswrapper[13046]: I0308 03:32:01.383275 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggwsm\" (UniqueName: \"kubernetes.io/projected/53e67e80-7f34-4f9a-897a-7ef14440a56e-kube-api-access-ggwsm\") pod \"53e67e80-7f34-4f9a-897a-7ef14440a56e\" (UID: \"53e67e80-7f34-4f9a-897a-7ef14440a56e\") " Mar 08 03:32:01.384560 master-0 kubenswrapper[13046]: I0308 03:32:01.384434 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-bundle" (OuterVolumeSpecName: "bundle") pod "53e67e80-7f34-4f9a-897a-7ef14440a56e" (UID: "53e67e80-7f34-4f9a-897a-7ef14440a56e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:01.388362 master-0 kubenswrapper[13046]: I0308 03:32:01.388261 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e67e80-7f34-4f9a-897a-7ef14440a56e-kube-api-access-ggwsm" (OuterVolumeSpecName: "kube-api-access-ggwsm") pod "53e67e80-7f34-4f9a-897a-7ef14440a56e" (UID: "53e67e80-7f34-4f9a-897a-7ef14440a56e"). InnerVolumeSpecName "kube-api-access-ggwsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:32:01.400146 master-0 kubenswrapper[13046]: I0308 03:32:01.399945 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-util" (OuterVolumeSpecName: "util") pod "53e67e80-7f34-4f9a-897a-7ef14440a56e" (UID: "53e67e80-7f34-4f9a-897a-7ef14440a56e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:01.485623 master-0 kubenswrapper[13046]: I0308 03:32:01.485527 13046 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-util\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:01.485623 master-0 kubenswrapper[13046]: I0308 03:32:01.485587 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggwsm\" (UniqueName: \"kubernetes.io/projected/53e67e80-7f34-4f9a-897a-7ef14440a56e-kube-api-access-ggwsm\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:01.485623 master-0 kubenswrapper[13046]: I0308 03:32:01.485612 13046 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53e67e80-7f34-4f9a-897a-7ef14440a56e-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:01.763411 master-0 kubenswrapper[13046]: I0308 03:32:01.763168 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b" event={"ID":"53e67e80-7f34-4f9a-897a-7ef14440a56e","Type":"ContainerDied","Data":"791b7724f5d368c66bacbe7595e21077505394772c99e13659d01bbfe08b1fd4"} Mar 08 03:32:01.763411 master-0 kubenswrapper[13046]: I0308 03:32:01.763230 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="791b7724f5d368c66bacbe7595e21077505394772c99e13659d01bbfe08b1fd4" Mar 08 03:32:01.764721 master-0 kubenswrapper[13046]: I0308 03:32:01.764662 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xb27b" Mar 08 03:32:02.208348 master-0 kubenswrapper[13046]: I0308 03:32:02.208139 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:32:02.289064 master-0 kubenswrapper[13046]: I0308 03:32:02.289024 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" Mar 08 03:32:02.301136 master-0 kubenswrapper[13046]: I0308 03:32:02.301031 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-bundle\") pod \"4578aaaa-d906-4d09-a9e2-a55dc5839447\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " Mar 08 03:32:02.301214 master-0 kubenswrapper[13046]: I0308 03:32:02.301137 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-bundle\") pod \"cd993df0-4411-46e5-8b19-779692a09d01\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " Mar 08 03:32:02.301214 master-0 kubenswrapper[13046]: I0308 03:32:02.301201 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-util\") pod \"cd993df0-4411-46e5-8b19-779692a09d01\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " Mar 08 03:32:02.302504 master-0 kubenswrapper[13046]: I0308 03:32:02.301274 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5g9c\" (UniqueName: \"kubernetes.io/projected/4578aaaa-d906-4d09-a9e2-a55dc5839447-kube-api-access-v5g9c\") pod \"4578aaaa-d906-4d09-a9e2-a55dc5839447\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " Mar 08 03:32:02.302504 master-0 kubenswrapper[13046]: I0308 03:32:02.301304 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-util\") pod \"4578aaaa-d906-4d09-a9e2-a55dc5839447\" (UID: \"4578aaaa-d906-4d09-a9e2-a55dc5839447\") " Mar 08 03:32:02.302504 master-0 kubenswrapper[13046]: I0308 03:32:02.301369 13046 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49b48\" (UniqueName: \"kubernetes.io/projected/cd993df0-4411-46e5-8b19-779692a09d01-kube-api-access-49b48\") pod \"cd993df0-4411-46e5-8b19-779692a09d01\" (UID: \"cd993df0-4411-46e5-8b19-779692a09d01\") " Mar 08 03:32:02.302504 master-0 kubenswrapper[13046]: I0308 03:32:02.301684 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-bundle" (OuterVolumeSpecName: "bundle") pod "4578aaaa-d906-4d09-a9e2-a55dc5839447" (UID: "4578aaaa-d906-4d09-a9e2-a55dc5839447"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:02.302504 master-0 kubenswrapper[13046]: I0308 03:32:02.301917 13046 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:02.307535 master-0 kubenswrapper[13046]: I0308 03:32:02.306279 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4578aaaa-d906-4d09-a9e2-a55dc5839447-kube-api-access-v5g9c" (OuterVolumeSpecName: "kube-api-access-v5g9c") pod "4578aaaa-d906-4d09-a9e2-a55dc5839447" (UID: "4578aaaa-d906-4d09-a9e2-a55dc5839447"). InnerVolumeSpecName "kube-api-access-v5g9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:32:02.307535 master-0 kubenswrapper[13046]: I0308 03:32:02.306340 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd993df0-4411-46e5-8b19-779692a09d01-kube-api-access-49b48" (OuterVolumeSpecName: "kube-api-access-49b48") pod "cd993df0-4411-46e5-8b19-779692a09d01" (UID: "cd993df0-4411-46e5-8b19-779692a09d01"). InnerVolumeSpecName "kube-api-access-49b48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:32:02.312998 master-0 kubenswrapper[13046]: I0308 03:32:02.312962 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-util" (OuterVolumeSpecName: "util") pod "4578aaaa-d906-4d09-a9e2-a55dc5839447" (UID: "4578aaaa-d906-4d09-a9e2-a55dc5839447"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:02.313328 master-0 kubenswrapper[13046]: I0308 03:32:02.313278 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-util" (OuterVolumeSpecName: "util") pod "cd993df0-4411-46e5-8b19-779692a09d01" (UID: "cd993df0-4411-46e5-8b19-779692a09d01"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:02.314205 master-0 kubenswrapper[13046]: I0308 03:32:02.314171 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-bundle" (OuterVolumeSpecName: "bundle") pod "cd993df0-4411-46e5-8b19-779692a09d01" (UID: "cd993df0-4411-46e5-8b19-779692a09d01"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:02.403132 master-0 kubenswrapper[13046]: I0308 03:32:02.403036 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49b48\" (UniqueName: \"kubernetes.io/projected/cd993df0-4411-46e5-8b19-779692a09d01-kube-api-access-49b48\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:02.403132 master-0 kubenswrapper[13046]: I0308 03:32:02.403086 13046 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:02.403132 master-0 kubenswrapper[13046]: I0308 03:32:02.403102 13046 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd993df0-4411-46e5-8b19-779692a09d01-util\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:02.403132 master-0 kubenswrapper[13046]: I0308 03:32:02.403115 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5g9c\" (UniqueName: \"kubernetes.io/projected/4578aaaa-d906-4d09-a9e2-a55dc5839447-kube-api-access-v5g9c\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:02.403132 master-0 kubenswrapper[13046]: I0308 03:32:02.403128 13046 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4578aaaa-d906-4d09-a9e2-a55dc5839447-util\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:02.567256 master-0 kubenswrapper[13046]: I0308 03:32:02.567086 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb"] Mar 08 03:32:02.567573 master-0 kubenswrapper[13046]: E0308 03:32:02.567469 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerName="pull" Mar 08 03:32:02.567573 master-0 kubenswrapper[13046]: I0308 03:32:02.567517 13046 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerName="pull" Mar 08 03:32:02.567573 master-0 kubenswrapper[13046]: E0308 03:32:02.567564 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerName="extract" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: I0308 03:32:02.567579 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerName="extract" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: E0308 03:32:02.567598 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerName="pull" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: I0308 03:32:02.567612 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerName="pull" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: E0308 03:32:02.567652 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerName="extract" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: I0308 03:32:02.567664 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerName="extract" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: E0308 03:32:02.567686 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerName="util" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: I0308 03:32:02.567698 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerName="util" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: E0308 03:32:02.567717 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd993df0-4411-46e5-8b19-779692a09d01" containerName="util" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: I0308 
03:32:02.567729 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd993df0-4411-46e5-8b19-779692a09d01" containerName="util" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: E0308 03:32:02.567748 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd993df0-4411-46e5-8b19-779692a09d01" containerName="pull" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: I0308 03:32:02.567781 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd993df0-4411-46e5-8b19-779692a09d01" containerName="pull" Mar 08 03:32:02.567809 master-0 kubenswrapper[13046]: E0308 03:32:02.567818 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerName="util" Mar 08 03:32:02.568585 master-0 kubenswrapper[13046]: I0308 03:32:02.567831 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerName="util" Mar 08 03:32:02.568585 master-0 kubenswrapper[13046]: E0308 03:32:02.567862 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd993df0-4411-46e5-8b19-779692a09d01" containerName="extract" Mar 08 03:32:02.568585 master-0 kubenswrapper[13046]: I0308 03:32:02.567874 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd993df0-4411-46e5-8b19-779692a09d01" containerName="extract" Mar 08 03:32:02.568585 master-0 kubenswrapper[13046]: I0308 03:32:02.568105 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd993df0-4411-46e5-8b19-779692a09d01" containerName="extract" Mar 08 03:32:02.568585 master-0 kubenswrapper[13046]: I0308 03:32:02.568156 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e67e80-7f34-4f9a-897a-7ef14440a56e" containerName="extract" Mar 08 03:32:02.568585 master-0 kubenswrapper[13046]: I0308 03:32:02.568184 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="4578aaaa-d906-4d09-a9e2-a55dc5839447" containerName="extract" Mar 08 03:32:02.569878 
master-0 kubenswrapper[13046]: I0308 03:32:02.569829 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.596612 master-0 kubenswrapper[13046]: I0308 03:32:02.596537 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb"] Mar 08 03:32:02.708868 master-0 kubenswrapper[13046]: I0308 03:32:02.708792 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2x5\" (UniqueName: \"kubernetes.io/projected/74d8bd87-ec6f-4494-b2d1-c09362ec7955-kube-api-access-mq2x5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.709382 master-0 kubenswrapper[13046]: I0308 03:32:02.709338 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.709686 master-0 kubenswrapper[13046]: I0308 03:32:02.709650 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.774925 master-0 kubenswrapper[13046]: I0308 
03:32:02.774864 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" Mar 08 03:32:02.775897 master-0 kubenswrapper[13046]: I0308 03:32:02.774878 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ltqds" event={"ID":"4578aaaa-d906-4d09-a9e2-a55dc5839447","Type":"ContainerDied","Data":"c1f275deaf30322ddb50bbb6886ad52914f83202a67bf03558bd01a2cb2b05de"} Mar 08 03:32:02.776152 master-0 kubenswrapper[13046]: I0308 03:32:02.776123 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f275deaf30322ddb50bbb6886ad52914f83202a67bf03558bd01a2cb2b05de" Mar 08 03:32:02.778632 master-0 kubenswrapper[13046]: I0308 03:32:02.778570 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" event={"ID":"cd993df0-4411-46e5-8b19-779692a09d01","Type":"ContainerDied","Data":"1e39820b1c5fcd9eb5b0132020b7a542773156a7e2d4b29878ac02babc055820"} Mar 08 03:32:02.778632 master-0 kubenswrapper[13046]: I0308 03:32:02.778634 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e39820b1c5fcd9eb5b0132020b7a542773156a7e2d4b29878ac02babc055820" Mar 08 03:32:02.780020 master-0 kubenswrapper[13046]: I0308 03:32:02.778630 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4nwjnh" Mar 08 03:32:02.811301 master-0 kubenswrapper[13046]: I0308 03:32:02.811230 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2x5\" (UniqueName: \"kubernetes.io/projected/74d8bd87-ec6f-4494-b2d1-c09362ec7955-kube-api-access-mq2x5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.811473 master-0 kubenswrapper[13046]: I0308 03:32:02.811425 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.811755 master-0 kubenswrapper[13046]: I0308 03:32:02.811505 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.812041 master-0 kubenswrapper[13046]: I0308 03:32:02.811990 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.812338 master-0 kubenswrapper[13046]: I0308 03:32:02.812288 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.831322 master-0 kubenswrapper[13046]: I0308 03:32:02.831193 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2x5\" (UniqueName: \"kubernetes.io/projected/74d8bd87-ec6f-4494-b2d1-c09362ec7955-kube-api-access-mq2x5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:02.892580 master-0 kubenswrapper[13046]: I0308 03:32:02.892476 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:03.392846 master-0 kubenswrapper[13046]: W0308 03:32:03.392360 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d8bd87_ec6f_4494_b2d1_c09362ec7955.slice/crio-f7ac0477d5da95d3f60b78adc0a36042c4dfca6cc503c1a21baef173108a7320 WatchSource:0}: Error finding container f7ac0477d5da95d3f60b78adc0a36042c4dfca6cc503c1a21baef173108a7320: Status 404 returned error can't find the container with id f7ac0477d5da95d3f60b78adc0a36042c4dfca6cc503c1a21baef173108a7320 Mar 08 03:32:03.395989 master-0 kubenswrapper[13046]: I0308 03:32:03.395925 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb"] Mar 08 03:32:03.789981 master-0 kubenswrapper[13046]: I0308 03:32:03.789914 13046 generic.go:334] "Generic (PLEG): container finished" podID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerID="15809c306ace4e712adf8138e9b0a7abe02d9d6a4873deb11c1e69d585b77238" exitCode=0 Mar 08 03:32:03.790847 master-0 kubenswrapper[13046]: I0308 03:32:03.789972 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" event={"ID":"74d8bd87-ec6f-4494-b2d1-c09362ec7955","Type":"ContainerDied","Data":"15809c306ace4e712adf8138e9b0a7abe02d9d6a4873deb11c1e69d585b77238"} Mar 08 03:32:03.790847 master-0 kubenswrapper[13046]: I0308 03:32:03.790048 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" event={"ID":"74d8bd87-ec6f-4494-b2d1-c09362ec7955","Type":"ContainerStarted","Data":"f7ac0477d5da95d3f60b78adc0a36042c4dfca6cc503c1a21baef173108a7320"} Mar 08 03:32:05.804277 master-0 kubenswrapper[13046]: I0308 03:32:05.804157 13046 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" event={"ID":"74d8bd87-ec6f-4494-b2d1-c09362ec7955","Type":"ContainerStarted","Data":"ee846fda43aabb77652b974ba2a837ea83d8d966d968701f3a275b897d7789ae"} Mar 08 03:32:06.130051 master-0 kubenswrapper[13046]: I0308 03:32:06.129991 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm"] Mar 08 03:32:06.131126 master-0 kubenswrapper[13046]: I0308 03:32:06.131090 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.133015 master-0 kubenswrapper[13046]: I0308 03:32:06.132970 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 08 03:32:06.133181 master-0 kubenswrapper[13046]: I0308 03:32:06.133138 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 08 03:32:06.150992 master-0 kubenswrapper[13046]: I0308 03:32:06.150930 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm"] Mar 08 03:32:06.271504 master-0 kubenswrapper[13046]: I0308 03:32:06.270098 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl8x9\" (UniqueName: \"kubernetes.io/projected/82d65021-4cae-42d1-950a-2c52a1ec1729-kube-api-access-tl8x9\") pod \"cert-manager-operator-controller-manager-66c8bdd694-rgzbm\" (UID: \"82d65021-4cae-42d1-950a-2c52a1ec1729\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.271504 master-0 kubenswrapper[13046]: I0308 03:32:06.270226 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82d65021-4cae-42d1-950a-2c52a1ec1729-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-rgzbm\" (UID: \"82d65021-4cae-42d1-950a-2c52a1ec1729\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.372297 master-0 kubenswrapper[13046]: I0308 03:32:06.372189 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82d65021-4cae-42d1-950a-2c52a1ec1729-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-rgzbm\" (UID: \"82d65021-4cae-42d1-950a-2c52a1ec1729\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.372626 master-0 kubenswrapper[13046]: I0308 03:32:06.372341 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl8x9\" (UniqueName: \"kubernetes.io/projected/82d65021-4cae-42d1-950a-2c52a1ec1729-kube-api-access-tl8x9\") pod \"cert-manager-operator-controller-manager-66c8bdd694-rgzbm\" (UID: \"82d65021-4cae-42d1-950a-2c52a1ec1729\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.373109 master-0 kubenswrapper[13046]: I0308 03:32:06.373053 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82d65021-4cae-42d1-950a-2c52a1ec1729-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-rgzbm\" (UID: \"82d65021-4cae-42d1-950a-2c52a1ec1729\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.392713 master-0 kubenswrapper[13046]: I0308 03:32:06.392558 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl8x9\" (UniqueName: 
\"kubernetes.io/projected/82d65021-4cae-42d1-950a-2c52a1ec1729-kube-api-access-tl8x9\") pod \"cert-manager-operator-controller-manager-66c8bdd694-rgzbm\" (UID: \"82d65021-4cae-42d1-950a-2c52a1ec1729\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.452504 master-0 kubenswrapper[13046]: I0308 03:32:06.452444 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" Mar 08 03:32:06.814732 master-0 kubenswrapper[13046]: I0308 03:32:06.814594 13046 generic.go:334] "Generic (PLEG): container finished" podID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerID="ee846fda43aabb77652b974ba2a837ea83d8d966d968701f3a275b897d7789ae" exitCode=0 Mar 08 03:32:06.814732 master-0 kubenswrapper[13046]: I0308 03:32:06.814662 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" event={"ID":"74d8bd87-ec6f-4494-b2d1-c09362ec7955","Type":"ContainerDied","Data":"ee846fda43aabb77652b974ba2a837ea83d8d966d968701f3a275b897d7789ae"} Mar 08 03:32:06.923211 master-0 kubenswrapper[13046]: I0308 03:32:06.923162 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm"] Mar 08 03:32:06.927722 master-0 kubenswrapper[13046]: W0308 03:32:06.927676 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d65021_4cae_42d1_950a_2c52a1ec1729.slice/crio-1542519ff9106adad2f8718b776c737102622f05ee32d3319859a040a2fe10c1 WatchSource:0}: Error finding container 1542519ff9106adad2f8718b776c737102622f05ee32d3319859a040a2fe10c1: Status 404 returned error can't find the container with id 1542519ff9106adad2f8718b776c737102622f05ee32d3319859a040a2fe10c1 Mar 08 03:32:07.825982 master-0 kubenswrapper[13046]: 
I0308 03:32:07.825787 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" event={"ID":"82d65021-4cae-42d1-950a-2c52a1ec1729","Type":"ContainerStarted","Data":"1542519ff9106adad2f8718b776c737102622f05ee32d3319859a040a2fe10c1"} Mar 08 03:32:07.828400 master-0 kubenswrapper[13046]: I0308 03:32:07.828345 13046 generic.go:334] "Generic (PLEG): container finished" podID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerID="28b98e7af9c0cf8e8deacb601d7ece93844c303adda7ac13ddb07e82958a9ac9" exitCode=0 Mar 08 03:32:07.828620 master-0 kubenswrapper[13046]: I0308 03:32:07.828412 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" event={"ID":"74d8bd87-ec6f-4494-b2d1-c09362ec7955","Type":"ContainerDied","Data":"28b98e7af9c0cf8e8deacb601d7ece93844c303adda7ac13ddb07e82958a9ac9"} Mar 08 03:32:10.259470 master-0 kubenswrapper[13046]: I0308 03:32:10.259428 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:10.367718 master-0 kubenswrapper[13046]: I0308 03:32:10.357252 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-util\") pod \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " Mar 08 03:32:10.367718 master-0 kubenswrapper[13046]: I0308 03:32:10.357356 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-bundle\") pod \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " Mar 08 03:32:10.367718 master-0 kubenswrapper[13046]: I0308 03:32:10.357434 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2x5\" (UniqueName: \"kubernetes.io/projected/74d8bd87-ec6f-4494-b2d1-c09362ec7955-kube-api-access-mq2x5\") pod \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\" (UID: \"74d8bd87-ec6f-4494-b2d1-c09362ec7955\") " Mar 08 03:32:10.367718 master-0 kubenswrapper[13046]: I0308 03:32:10.359000 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-bundle" (OuterVolumeSpecName: "bundle") pod "74d8bd87-ec6f-4494-b2d1-c09362ec7955" (UID: "74d8bd87-ec6f-4494-b2d1-c09362ec7955"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:10.367718 master-0 kubenswrapper[13046]: I0308 03:32:10.364635 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d8bd87-ec6f-4494-b2d1-c09362ec7955-kube-api-access-mq2x5" (OuterVolumeSpecName: "kube-api-access-mq2x5") pod "74d8bd87-ec6f-4494-b2d1-c09362ec7955" (UID: "74d8bd87-ec6f-4494-b2d1-c09362ec7955"). InnerVolumeSpecName "kube-api-access-mq2x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:32:10.367718 master-0 kubenswrapper[13046]: I0308 03:32:10.365872 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-util" (OuterVolumeSpecName: "util") pod "74d8bd87-ec6f-4494-b2d1-c09362ec7955" (UID: "74d8bd87-ec6f-4494-b2d1-c09362ec7955"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:32:10.459752 master-0 kubenswrapper[13046]: I0308 03:32:10.459706 13046 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-util\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:10.460071 master-0 kubenswrapper[13046]: I0308 03:32:10.459768 13046 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74d8bd87-ec6f-4494-b2d1-c09362ec7955-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:10.460071 master-0 kubenswrapper[13046]: I0308 03:32:10.459787 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2x5\" (UniqueName: \"kubernetes.io/projected/74d8bd87-ec6f-4494-b2d1-c09362ec7955-kube-api-access-mq2x5\") on node \"master-0\" DevicePath \"\"" Mar 08 03:32:10.857894 master-0 kubenswrapper[13046]: I0308 03:32:10.857548 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" event={"ID":"74d8bd87-ec6f-4494-b2d1-c09362ec7955","Type":"ContainerDied","Data":"f7ac0477d5da95d3f60b78adc0a36042c4dfca6cc503c1a21baef173108a7320"} Mar 08 03:32:10.857894 master-0 kubenswrapper[13046]: I0308 03:32:10.857616 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7ac0477d5da95d3f60b78adc0a36042c4dfca6cc503c1a21baef173108a7320" Mar 08 03:32:10.857894 master-0 kubenswrapper[13046]: I0308 03:32:10.857719 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08hvctb" Mar 08 03:32:10.863496 master-0 kubenswrapper[13046]: I0308 03:32:10.860561 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" event={"ID":"82d65021-4cae-42d1-950a-2c52a1ec1729","Type":"ContainerStarted","Data":"c055fc88c8c658c0778fb9b1d4b036d38a87e5d6d63d1e956fdce72918f0f8ae"} Mar 08 03:32:10.902496 master-0 kubenswrapper[13046]: I0308 03:32:10.900128 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-rgzbm" podStartSLOduration=1.508596071 podStartE2EDuration="4.90010553s" podCreationTimestamp="2026-03-08 03:32:06 +0000 UTC" firstStartedPulling="2026-03-08 03:32:06.934041683 +0000 UTC m=+1129.012808900" lastFinishedPulling="2026-03-08 03:32:10.325551142 +0000 UTC m=+1132.404318359" observedRunningTime="2026-03-08 03:32:10.887661248 +0000 UTC m=+1132.966428465" watchObservedRunningTime="2026-03-08 03:32:10.90010553 +0000 UTC m=+1132.978872747" Mar 08 03:32:10.974504 master-0 kubenswrapper[13046]: E0308 03:32:10.973914 13046 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d8bd87_ec6f_4494_b2d1_c09362ec7955.slice\": RecentStats: unable to find data in memory cache]" Mar 08 03:32:17.672825 master-0 kubenswrapper[13046]: I0308 03:32:17.672761 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-kbtxq"] Mar 08 03:32:17.673664 master-0 kubenswrapper[13046]: E0308 03:32:17.673073 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerName="pull" Mar 08 03:32:17.673664 master-0 kubenswrapper[13046]: I0308 03:32:17.673089 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerName="pull" Mar 08 03:32:17.673664 master-0 kubenswrapper[13046]: E0308 03:32:17.673107 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerName="extract" Mar 08 03:32:17.673664 master-0 kubenswrapper[13046]: I0308 03:32:17.673115 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerName="extract" Mar 08 03:32:17.673664 master-0 kubenswrapper[13046]: E0308 03:32:17.673148 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerName="util" Mar 08 03:32:17.673664 master-0 kubenswrapper[13046]: I0308 03:32:17.673156 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerName="util" Mar 08 03:32:17.673664 master-0 kubenswrapper[13046]: I0308 03:32:17.673345 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d8bd87-ec6f-4494-b2d1-c09362ec7955" containerName="extract" Mar 08 03:32:17.674126 master-0 kubenswrapper[13046]: I0308 03:32:17.673893 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 03:32:17.681394 master-0 kubenswrapper[13046]: I0308 03:32:17.681331 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 08 03:32:17.681394 master-0 kubenswrapper[13046]: I0308 03:32:17.681388 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 08 03:32:17.751381 master-0 kubenswrapper[13046]: I0308 03:32:17.751317 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-kbtxq"] Mar 08 03:32:17.787380 master-0 kubenswrapper[13046]: I0308 03:32:17.787137 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/901728e5-edfb-4a63-a5bf-62c26ee8e926-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-kbtxq\" (UID: \"901728e5-edfb-4a63-a5bf-62c26ee8e926\") " pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 03:32:17.787380 master-0 kubenswrapper[13046]: I0308 03:32:17.787298 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nssnk\" (UniqueName: \"kubernetes.io/projected/901728e5-edfb-4a63-a5bf-62c26ee8e926-kube-api-access-nssnk\") pod \"cert-manager-cainjector-5545bd876-kbtxq\" (UID: \"901728e5-edfb-4a63-a5bf-62c26ee8e926\") " pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 03:32:17.888373 master-0 kubenswrapper[13046]: I0308 03:32:17.888290 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/901728e5-edfb-4a63-a5bf-62c26ee8e926-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-kbtxq\" (UID: \"901728e5-edfb-4a63-a5bf-62c26ee8e926\") " pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 
03:32:17.888682 master-0 kubenswrapper[13046]: I0308 03:32:17.888420 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nssnk\" (UniqueName: \"kubernetes.io/projected/901728e5-edfb-4a63-a5bf-62c26ee8e926-kube-api-access-nssnk\") pod \"cert-manager-cainjector-5545bd876-kbtxq\" (UID: \"901728e5-edfb-4a63-a5bf-62c26ee8e926\") " pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 03:32:17.903846 master-0 kubenswrapper[13046]: I0308 03:32:17.903733 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/901728e5-edfb-4a63-a5bf-62c26ee8e926-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-kbtxq\" (UID: \"901728e5-edfb-4a63-a5bf-62c26ee8e926\") " pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 03:32:17.909200 master-0 kubenswrapper[13046]: I0308 03:32:17.909155 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nssnk\" (UniqueName: \"kubernetes.io/projected/901728e5-edfb-4a63-a5bf-62c26ee8e926-kube-api-access-nssnk\") pod \"cert-manager-cainjector-5545bd876-kbtxq\" (UID: \"901728e5-edfb-4a63-a5bf-62c26ee8e926\") " pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 03:32:17.998539 master-0 kubenswrapper[13046]: I0308 03:32:17.998406 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" Mar 08 03:32:18.043338 master-0 kubenswrapper[13046]: I0308 03:32:18.043231 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq"] Mar 08 03:32:18.047615 master-0 kubenswrapper[13046]: I0308 03:32:18.047517 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq" Mar 08 03:32:18.054337 master-0 kubenswrapper[13046]: I0308 03:32:18.054284 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq"] Mar 08 03:32:18.056536 master-0 kubenswrapper[13046]: I0308 03:32:18.055813 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 08 03:32:18.056536 master-0 kubenswrapper[13046]: I0308 03:32:18.055895 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 08 03:32:18.194436 master-0 kubenswrapper[13046]: I0308 03:32:18.194385 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5b7b\" (UniqueName: \"kubernetes.io/projected/a123bab3-dc4d-42e5-a156-8dc6c3612334-kube-api-access-v5b7b\") pod \"nmstate-operator-75c5dccd6c-vhkqq\" (UID: \"a123bab3-dc4d-42e5-a156-8dc6c3612334\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq" Mar 08 03:32:18.295844 master-0 kubenswrapper[13046]: I0308 03:32:18.295724 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5b7b\" (UniqueName: \"kubernetes.io/projected/a123bab3-dc4d-42e5-a156-8dc6c3612334-kube-api-access-v5b7b\") pod \"nmstate-operator-75c5dccd6c-vhkqq\" (UID: \"a123bab3-dc4d-42e5-a156-8dc6c3612334\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq" Mar 08 03:32:18.309863 master-0 kubenswrapper[13046]: I0308 03:32:18.309814 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 08 03:32:18.320376 master-0 kubenswrapper[13046]: I0308 03:32:18.320327 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 08 03:32:18.334883 master-0 kubenswrapper[13046]: I0308 03:32:18.334825 
13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5b7b\" (UniqueName: \"kubernetes.io/projected/a123bab3-dc4d-42e5-a156-8dc6c3612334-kube-api-access-v5b7b\") pod \"nmstate-operator-75c5dccd6c-vhkqq\" (UID: \"a123bab3-dc4d-42e5-a156-8dc6c3612334\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq"
Mar 08 03:32:18.407557 master-0 kubenswrapper[13046]: I0308 03:32:18.406850 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq"
Mar 08 03:32:18.504462 master-0 kubenswrapper[13046]: I0308 03:32:18.502927 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-kbtxq"]
Mar 08 03:32:18.916706 master-0 kubenswrapper[13046]: I0308 03:32:18.916535 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" event={"ID":"901728e5-edfb-4a63-a5bf-62c26ee8e926","Type":"ContainerStarted","Data":"5015881c65efe1d9b10927cc63f6578e931d0d9fa43c04810f1a146b4f5a63c3"}
Mar 08 03:32:18.959544 master-0 kubenswrapper[13046]: I0308 03:32:18.959446 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq"]
Mar 08 03:32:19.926036 master-0 kubenswrapper[13046]: I0308 03:32:19.925953 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq" event={"ID":"a123bab3-dc4d-42e5-a156-8dc6c3612334","Type":"ContainerStarted","Data":"bf7bfff72ff31c17d715357573383cc3fd8ef13c5ffd52db6c676aeeb6bdb583"}
Mar 08 03:32:24.660394 master-0 kubenswrapper[13046]: I0308 03:32:24.660348 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-m5cdg"]
Mar 08 03:32:24.661741 master-0 kubenswrapper[13046]: I0308 03:32:24.661724 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.682931 master-0 kubenswrapper[13046]: I0308 03:32:24.682871 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-m5cdg"]
Mar 08 03:32:24.736504 master-0 kubenswrapper[13046]: I0308 03:32:24.733444 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f85f7be1-ef18-4178-b853-74565c068ac9-bound-sa-token\") pod \"cert-manager-545d4d4674-m5cdg\" (UID: \"f85f7be1-ef18-4178-b853-74565c068ac9\") " pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.736504 master-0 kubenswrapper[13046]: I0308 03:32:24.733565 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfpj8\" (UniqueName: \"kubernetes.io/projected/f85f7be1-ef18-4178-b853-74565c068ac9-kube-api-access-gfpj8\") pod \"cert-manager-545d4d4674-m5cdg\" (UID: \"f85f7be1-ef18-4178-b853-74565c068ac9\") " pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.803653 master-0 kubenswrapper[13046]: I0308 03:32:24.803603 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-v7w8w"]
Mar 08 03:32:24.804459 master-0 kubenswrapper[13046]: I0308 03:32:24.804442 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:24.818735 master-0 kubenswrapper[13046]: I0308 03:32:24.818702 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-v7w8w"]
Mar 08 03:32:24.835448 master-0 kubenswrapper[13046]: I0308 03:32:24.835388 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfpj8\" (UniqueName: \"kubernetes.io/projected/f85f7be1-ef18-4178-b853-74565c068ac9-kube-api-access-gfpj8\") pod \"cert-manager-545d4d4674-m5cdg\" (UID: \"f85f7be1-ef18-4178-b853-74565c068ac9\") " pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.835771 master-0 kubenswrapper[13046]: I0308 03:32:24.835752 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c0704c-4295-41f3-993d-b46def585139-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-v7w8w\" (UID: \"16c0704c-4295-41f3-993d-b46def585139\") " pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:24.835916 master-0 kubenswrapper[13046]: I0308 03:32:24.835897 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vjh\" (UniqueName: \"kubernetes.io/projected/16c0704c-4295-41f3-993d-b46def585139-kube-api-access-49vjh\") pod \"cert-manager-webhook-6888856db4-v7w8w\" (UID: \"16c0704c-4295-41f3-993d-b46def585139\") " pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:24.836049 master-0 kubenswrapper[13046]: I0308 03:32:24.836030 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f85f7be1-ef18-4178-b853-74565c068ac9-bound-sa-token\") pod \"cert-manager-545d4d4674-m5cdg\" (UID: \"f85f7be1-ef18-4178-b853-74565c068ac9\") " pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.855650 master-0 kubenswrapper[13046]: I0308 03:32:24.855607 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f85f7be1-ef18-4178-b853-74565c068ac9-bound-sa-token\") pod \"cert-manager-545d4d4674-m5cdg\" (UID: \"f85f7be1-ef18-4178-b853-74565c068ac9\") " pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.857877 master-0 kubenswrapper[13046]: I0308 03:32:24.857839 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfpj8\" (UniqueName: \"kubernetes.io/projected/f85f7be1-ef18-4178-b853-74565c068ac9-kube-api-access-gfpj8\") pod \"cert-manager-545d4d4674-m5cdg\" (UID: \"f85f7be1-ef18-4178-b853-74565c068ac9\") " pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.940970 master-0 kubenswrapper[13046]: I0308 03:32:24.940865 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c0704c-4295-41f3-993d-b46def585139-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-v7w8w\" (UID: \"16c0704c-4295-41f3-993d-b46def585139\") " pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:24.940970 master-0 kubenswrapper[13046]: I0308 03:32:24.940917 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vjh\" (UniqueName: \"kubernetes.io/projected/16c0704c-4295-41f3-993d-b46def585139-kube-api-access-49vjh\") pod \"cert-manager-webhook-6888856db4-v7w8w\" (UID: \"16c0704c-4295-41f3-993d-b46def585139\") " pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:24.957205 master-0 kubenswrapper[13046]: I0308 03:32:24.957151 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vjh\" (UniqueName: \"kubernetes.io/projected/16c0704c-4295-41f3-993d-b46def585139-kube-api-access-49vjh\") pod \"cert-manager-webhook-6888856db4-v7w8w\" (UID: \"16c0704c-4295-41f3-993d-b46def585139\") " pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:24.959117 master-0 kubenswrapper[13046]: I0308 03:32:24.958203 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c0704c-4295-41f3-993d-b46def585139-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-v7w8w\" (UID: \"16c0704c-4295-41f3-993d-b46def585139\") " pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:24.974836 master-0 kubenswrapper[13046]: I0308 03:32:24.974540 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-m5cdg"
Mar 08 03:32:24.986525 master-0 kubenswrapper[13046]: I0308 03:32:24.986422 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" event={"ID":"901728e5-edfb-4a63-a5bf-62c26ee8e926","Type":"ContainerStarted","Data":"09af0c429acd938cf7169a35b765b47865123e08e1a5093c42e6625c2b018b32"}
Mar 08 03:32:24.989557 master-0 kubenswrapper[13046]: I0308 03:32:24.988271 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq" event={"ID":"a123bab3-dc4d-42e5-a156-8dc6c3612334","Type":"ContainerStarted","Data":"461de013e1f538d89ea6e5a11e0d338a2befb18e3b7cfa56ab4b61c19d524147"}
Mar 08 03:32:25.011576 master-0 kubenswrapper[13046]: I0308 03:32:25.010759 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-kbtxq" podStartSLOduration=2.261877883 podStartE2EDuration="8.010660074s" podCreationTimestamp="2026-03-08 03:32:17 +0000 UTC" firstStartedPulling="2026-03-08 03:32:18.493619547 +0000 UTC m=+1140.572386764" lastFinishedPulling="2026-03-08 03:32:24.242401738 +0000 UTC m=+1146.321168955" observedRunningTime="2026-03-08 03:32:25.006854177 +0000 UTC m=+1147.085621394" watchObservedRunningTime="2026-03-08 03:32:25.010660074 +0000 UTC m=+1147.089427301"
Mar 08 03:32:25.041709 master-0 kubenswrapper[13046]: I0308 03:32:25.041627 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-vhkqq" podStartSLOduration=1.798581641 podStartE2EDuration="7.04161167s" podCreationTimestamp="2026-03-08 03:32:18 +0000 UTC" firstStartedPulling="2026-03-08 03:32:18.974569368 +0000 UTC m=+1141.053336585" lastFinishedPulling="2026-03-08 03:32:24.217599357 +0000 UTC m=+1146.296366614" observedRunningTime="2026-03-08 03:32:25.037983797 +0000 UTC m=+1147.116751014" watchObservedRunningTime="2026-03-08 03:32:25.04161167 +0000 UTC m=+1147.120378887"
Mar 08 03:32:25.119284 master-0 kubenswrapper[13046]: I0308 03:32:25.118880 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:25.435639 master-0 kubenswrapper[13046]: I0308 03:32:25.435027 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-m5cdg"]
Mar 08 03:32:25.446265 master-0 kubenswrapper[13046]: W0308 03:32:25.445740 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf85f7be1_ef18_4178_b853_74565c068ac9.slice/crio-b3ef35657b8e7a3afaa8767b905363ce123c40f30bd7dc46e0342fa01392566a WatchSource:0}: Error finding container b3ef35657b8e7a3afaa8767b905363ce123c40f30bd7dc46e0342fa01392566a: Status 404 returned error can't find the container with id b3ef35657b8e7a3afaa8767b905363ce123c40f30bd7dc46e0342fa01392566a
Mar 08 03:32:25.537330 master-0 kubenswrapper[13046]: I0308 03:32:25.537261 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-v7w8w"]
Mar 08 03:32:25.546568 master-0 kubenswrapper[13046]: W0308 03:32:25.546504 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16c0704c_4295_41f3_993d_b46def585139.slice/crio-def4800dcec1bcbc64878d13e607e971c7404c67094cbd2e694d6ca8f7e2b975 WatchSource:0}: Error finding container def4800dcec1bcbc64878d13e607e971c7404c67094cbd2e694d6ca8f7e2b975: Status 404 returned error can't find the container with id def4800dcec1bcbc64878d13e607e971c7404c67094cbd2e694d6ca8f7e2b975
Mar 08 03:32:25.995492 master-0 kubenswrapper[13046]: I0308 03:32:25.995434 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w" event={"ID":"16c0704c-4295-41f3-993d-b46def585139","Type":"ContainerStarted","Data":"89a56369f3ce990aa0daa52cab803971b4557d9b3279ff01f1b8a85d7af0f3a7"}
Mar 08 03:32:25.995492 master-0 kubenswrapper[13046]: I0308 03:32:25.995494 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w" event={"ID":"16c0704c-4295-41f3-993d-b46def585139","Type":"ContainerStarted","Data":"def4800dcec1bcbc64878d13e607e971c7404c67094cbd2e694d6ca8f7e2b975"}
Mar 08 03:32:25.996299 master-0 kubenswrapper[13046]: I0308 03:32:25.996279 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:25.998247 master-0 kubenswrapper[13046]: I0308 03:32:25.998220 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-m5cdg" event={"ID":"f85f7be1-ef18-4178-b853-74565c068ac9","Type":"ContainerStarted","Data":"12d0e7794857bc1e4d762b297517963134d4b7b03b547c66ac84f7ae0e345b61"}
Mar 08 03:32:25.998316 master-0 kubenswrapper[13046]: I0308 03:32:25.998246 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-m5cdg" event={"ID":"f85f7be1-ef18-4178-b853-74565c068ac9","Type":"ContainerStarted","Data":"b3ef35657b8e7a3afaa8767b905363ce123c40f30bd7dc46e0342fa01392566a"}
Mar 08 03:32:26.096242 master-0 kubenswrapper[13046]: I0308 03:32:26.096178 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-m5cdg" podStartSLOduration=2.096163503 podStartE2EDuration="2.096163503s" podCreationTimestamp="2026-03-08 03:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:32:26.093265241 +0000 UTC m=+1148.172032458" watchObservedRunningTime="2026-03-08 03:32:26.096163503 +0000 UTC m=+1148.174930720"
Mar 08 03:32:26.103905 master-0 kubenswrapper[13046]: I0308 03:32:26.097513 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w" podStartSLOduration=2.097507311 podStartE2EDuration="2.097507311s" podCreationTimestamp="2026-03-08 03:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:32:26.029641332 +0000 UTC m=+1148.108408549" watchObservedRunningTime="2026-03-08 03:32:26.097507311 +0000 UTC m=+1148.176274538"
Mar 08 03:32:26.902370 master-0 kubenswrapper[13046]: I0308 03:32:26.902321 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"]
Mar 08 03:32:26.903336 master-0 kubenswrapper[13046]: I0308 03:32:26.903311 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:26.907686 master-0 kubenswrapper[13046]: I0308 03:32:26.907654 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 08 03:32:26.907927 master-0 kubenswrapper[13046]: I0308 03:32:26.907905 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 08 03:32:26.907972 master-0 kubenswrapper[13046]: I0308 03:32:26.907937 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 08 03:32:26.910072 master-0 kubenswrapper[13046]: I0308 03:32:26.910038 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 08 03:32:26.936196 master-0 kubenswrapper[13046]: I0308 03:32:26.936139 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"]
Mar 08 03:32:27.002506 master-0 kubenswrapper[13046]: I0308 03:32:27.000562 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-apiservice-cert\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.002506 master-0 kubenswrapper[13046]: I0308 03:32:27.000629 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfx4s\" (UniqueName: \"kubernetes.io/projected/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-kube-api-access-dfx4s\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.002506 master-0 kubenswrapper[13046]: I0308 03:32:27.000670 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-webhook-cert\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.102305 master-0 kubenswrapper[13046]: I0308 03:32:27.102250 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-apiservice-cert\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.102526 master-0 kubenswrapper[13046]: I0308 03:32:27.102317 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfx4s\" (UniqueName: \"kubernetes.io/projected/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-kube-api-access-dfx4s\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.102526 master-0 kubenswrapper[13046]: I0308 03:32:27.102517 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-webhook-cert\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.105723 master-0 kubenswrapper[13046]: I0308 03:32:27.105683 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-webhook-cert\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.107151 master-0 kubenswrapper[13046]: I0308 03:32:27.107126 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-apiservice-cert\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.141957 master-0 kubenswrapper[13046]: I0308 03:32:27.141916 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfx4s\" (UniqueName: \"kubernetes.io/projected/e1d15c8d-0326-4e12-bdba-ed6df8b88ed0-kube-api-access-dfx4s\") pod \"metallb-operator-controller-manager-68cfc6845d-mhm6t\" (UID: \"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0\") " pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.216481 master-0 kubenswrapper[13046]: I0308 03:32:27.216390 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:27.615068 master-0 kubenswrapper[13046]: I0308 03:32:27.612615 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"]
Mar 08 03:32:27.639673 master-0 kubenswrapper[13046]: I0308 03:32:27.639635 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"]
Mar 08 03:32:27.651081 master-0 kubenswrapper[13046]: I0308 03:32:27.651045 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.663673 master-0 kubenswrapper[13046]: I0308 03:32:27.663635 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 08 03:32:27.663943 master-0 kubenswrapper[13046]: I0308 03:32:27.663925 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 08 03:32:27.672979 master-0 kubenswrapper[13046]: I0308 03:32:27.672929 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"]
Mar 08 03:32:27.854283 master-0 kubenswrapper[13046]: I0308 03:32:27.854232 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/738a1cd1-8f37-4d94-abeb-36e19b8653b3-webhook-cert\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.854480 master-0 kubenswrapper[13046]: I0308 03:32:27.854355 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/738a1cd1-8f37-4d94-abeb-36e19b8653b3-apiservice-cert\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.854480 master-0 kubenswrapper[13046]: I0308 03:32:27.854384 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw782\" (UniqueName: \"kubernetes.io/projected/738a1cd1-8f37-4d94-abeb-36e19b8653b3-kube-api-access-qw782\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.956373 master-0 kubenswrapper[13046]: I0308 03:32:27.956261 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/738a1cd1-8f37-4d94-abeb-36e19b8653b3-apiservice-cert\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.956373 master-0 kubenswrapper[13046]: I0308 03:32:27.956311 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw782\" (UniqueName: \"kubernetes.io/projected/738a1cd1-8f37-4d94-abeb-36e19b8653b3-kube-api-access-qw782\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.956611 master-0 kubenswrapper[13046]: I0308 03:32:27.956392 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/738a1cd1-8f37-4d94-abeb-36e19b8653b3-webhook-cert\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.963353 master-0 kubenswrapper[13046]: I0308 03:32:27.963301 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/738a1cd1-8f37-4d94-abeb-36e19b8653b3-apiservice-cert\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.966346 master-0 kubenswrapper[13046]: I0308 03:32:27.966310 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/738a1cd1-8f37-4d94-abeb-36e19b8653b3-webhook-cert\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:27.980291 master-0 kubenswrapper[13046]: I0308 03:32:27.980240 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw782\" (UniqueName: \"kubernetes.io/projected/738a1cd1-8f37-4d94-abeb-36e19b8653b3-kube-api-access-qw782\") pod \"metallb-operator-webhook-server-7fb7496f9c-6jvkl\" (UID: \"738a1cd1-8f37-4d94-abeb-36e19b8653b3\") " pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:28.016081 master-0 kubenswrapper[13046]: I0308 03:32:28.016015 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:28.024840 master-0 kubenswrapper[13046]: I0308 03:32:28.024780 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t" event={"ID":"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0","Type":"ContainerStarted","Data":"a35b93b6906fa055037fa9363d67b870f323ec2bc3abf3abfd071b83d771f809"}
Mar 08 03:32:28.606103 master-0 kubenswrapper[13046]: W0308 03:32:28.606054 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod738a1cd1_8f37_4d94_abeb_36e19b8653b3.slice/crio-f5ffb991119595cc4350a3dc3bcae9b08e89cebf713807d97e4d1702702bf514 WatchSource:0}: Error finding container f5ffb991119595cc4350a3dc3bcae9b08e89cebf713807d97e4d1702702bf514: Status 404 returned error can't find the container with id f5ffb991119595cc4350a3dc3bcae9b08e89cebf713807d97e4d1702702bf514
Mar 08 03:32:28.607206 master-0 kubenswrapper[13046]: I0308 03:32:28.607152 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"]
Mar 08 03:32:29.032599 master-0 kubenswrapper[13046]: I0308 03:32:29.032536 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl" event={"ID":"738a1cd1-8f37-4d94-abeb-36e19b8653b3","Type":"ContainerStarted","Data":"f5ffb991119595cc4350a3dc3bcae9b08e89cebf713807d97e4d1702702bf514"}
Mar 08 03:32:30.148038 master-0 kubenswrapper[13046]: I0308 03:32:30.140357 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-v7w8w"
Mar 08 03:32:33.080251 master-0 kubenswrapper[13046]: I0308 03:32:33.080179 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t" event={"ID":"e1d15c8d-0326-4e12-bdba-ed6df8b88ed0","Type":"ContainerStarted","Data":"0707ffd26cecfd5a617d247bcb9ba3d42adbc4508738f1f54f9b79c8b849fa45"}
Mar 08 03:32:33.081272 master-0 kubenswrapper[13046]: I0308 03:32:33.081240 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t"
Mar 08 03:32:33.118238 master-0 kubenswrapper[13046]: I0308 03:32:33.118158 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t" podStartSLOduration=2.387138 podStartE2EDuration="7.118139209s" podCreationTimestamp="2026-03-08 03:32:26 +0000 UTC" firstStartedPulling="2026-03-08 03:32:27.64876635 +0000 UTC m=+1149.727533567" lastFinishedPulling="2026-03-08 03:32:32.379767569 +0000 UTC m=+1154.458534776" observedRunningTime="2026-03-08 03:32:33.11393916 +0000 UTC m=+1155.192706377" watchObservedRunningTime="2026-03-08 03:32:33.118139209 +0000 UTC m=+1155.196906416"
Mar 08 03:32:36.176513 master-0 kubenswrapper[13046]: I0308 03:32:36.164604 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl" event={"ID":"738a1cd1-8f37-4d94-abeb-36e19b8653b3","Type":"ContainerStarted","Data":"f2c393e134df45eccc848aab60b7017882580b06d3a7f72e65f18cbbf504541f"}
Mar 08 03:32:36.176513 master-0 kubenswrapper[13046]: I0308 03:32:36.164701 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl"
Mar 08 03:32:36.176513 master-0 kubenswrapper[13046]: I0308 03:32:36.167459 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl" podStartSLOduration=2.307749353 podStartE2EDuration="9.16743545s" podCreationTimestamp="2026-03-08 03:32:27 +0000 UTC" firstStartedPulling="2026-03-08 03:32:28.613370888 +0000 UTC m=+1150.692138225" lastFinishedPulling="2026-03-08 03:32:35.473057105 +0000 UTC m=+1157.551824322" observedRunningTime="2026-03-08 03:32:36.161866442 +0000 UTC m=+1158.240633689" watchObservedRunningTime="2026-03-08 03:32:36.16743545 +0000 UTC m=+1158.246202687"
Mar 08 03:32:36.324221 master-0 kubenswrapper[13046]: I0308 03:32:36.324154 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q"]
Mar 08 03:32:36.325102 master-0 kubenswrapper[13046]: I0308 03:32:36.325045 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q"
Mar 08 03:32:36.327918 master-0 kubenswrapper[13046]: I0308 03:32:36.327875 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 08 03:32:36.328152 master-0 kubenswrapper[13046]: I0308 03:32:36.328135 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 08 03:32:36.369290 master-0 kubenswrapper[13046]: I0308 03:32:36.341349 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlccc\" (UniqueName: \"kubernetes.io/projected/6b238b7b-b7a1-4015-9d14-e7406c041e99-kube-api-access-xlccc\") pod \"obo-prometheus-operator-68bc856cb9-9g89q\" (UID: \"6b238b7b-b7a1-4015-9d14-e7406c041e99\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q"
Mar 08 03:32:36.369290 master-0 kubenswrapper[13046]: I0308 03:32:36.368391 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q"]
Mar 08 03:32:36.443557 master-0 kubenswrapper[13046]: I0308 03:32:36.443428 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlccc\" (UniqueName: \"kubernetes.io/projected/6b238b7b-b7a1-4015-9d14-e7406c041e99-kube-api-access-xlccc\") pod \"obo-prometheus-operator-68bc856cb9-9g89q\" (UID: \"6b238b7b-b7a1-4015-9d14-e7406c041e99\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q"
Mar 08 03:32:36.466217 master-0 kubenswrapper[13046]: I0308 03:32:36.466182 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlccc\" (UniqueName: \"kubernetes.io/projected/6b238b7b-b7a1-4015-9d14-e7406c041e99-kube-api-access-xlccc\") pod \"obo-prometheus-operator-68bc856cb9-9g89q\" (UID: \"6b238b7b-b7a1-4015-9d14-e7406c041e99\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q"
Mar 08 03:32:36.467757 master-0 kubenswrapper[13046]: I0308 03:32:36.467408 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"]
Mar 08 03:32:36.469162 master-0 kubenswrapper[13046]: I0308 03:32:36.469103 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"
Mar 08 03:32:36.470967 master-0 kubenswrapper[13046]: I0308 03:32:36.470780 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 08 03:32:36.484608 master-0 kubenswrapper[13046]: I0308 03:32:36.482975 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"]
Mar 08 03:32:36.490616 master-0 kubenswrapper[13046]: I0308 03:32:36.490573 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"]
Mar 08 03:32:36.493203 master-0 kubenswrapper[13046]: I0308 03:32:36.493182 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"
Mar 08 03:32:36.542910 master-0 kubenswrapper[13046]: I0308 03:32:36.542842 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"]
Mar 08 03:32:36.550353 master-0 kubenswrapper[13046]: I0308 03:32:36.550271 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a82d9d9b-48e5-4262-ab21-37409995c543-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl\" (UID: \"a82d9d9b-48e5-4262-ab21-37409995c543\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"
Mar 08 03:32:36.550448 master-0 kubenswrapper[13046]: I0308 03:32:36.550378 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j\" (UID: \"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"
Mar 08 03:32:36.550448 master-0 kubenswrapper[13046]: I0308 03:32:36.550438 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a82d9d9b-48e5-4262-ab21-37409995c543-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl\" (UID: \"a82d9d9b-48e5-4262-ab21-37409995c543\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"
Mar 08 03:32:36.550770 master-0 kubenswrapper[13046]: I0308 03:32:36.550516 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j\" (UID: \"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"
Mar 08 03:32:36.642746 master-0 kubenswrapper[13046]: I0308 03:32:36.642637 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8m6ll"]
Mar 08 03:32:36.644976 master-0 kubenswrapper[13046]: I0308 03:32:36.644930 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8m6ll"
Mar 08 03:32:36.646938 master-0 kubenswrapper[13046]: I0308 03:32:36.646908 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 08 03:32:36.652547 master-0 kubenswrapper[13046]: I0308 03:32:36.652467 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j\" (UID: \"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"
Mar 08 03:32:36.652716 master-0 kubenswrapper[13046]: I0308 03:32:36.652576 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a82d9d9b-48e5-4262-ab21-37409995c543-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl\" (UID: \"a82d9d9b-48e5-4262-ab21-37409995c543\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"
Mar 08 03:32:36.652716 master-0 kubenswrapper[13046]: I0308 03:32:36.652627 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j\" (UID: \"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"
Mar 08 03:32:36.652716 master-0 kubenswrapper[13046]: I0308 03:32:36.652682 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a82d9d9b-48e5-4262-ab21-37409995c543-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl\" (UID: \"a82d9d9b-48e5-4262-ab21-37409995c543\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"
Mar 08 03:32:36.659499 master-0 kubenswrapper[13046]: I0308 03:32:36.657727 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j\" (UID: \"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"
Mar 08 03:32:36.659499 master-0 kubenswrapper[13046]: I0308 03:32:36.657815 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a82d9d9b-48e5-4262-ab21-37409995c543-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl\" (UID: \"a82d9d9b-48e5-4262-ab21-37409995c543\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"
Mar 08 03:32:36.662728 master-0 kubenswrapper[13046]: I0308 03:32:36.661926 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a82d9d9b-48e5-4262-ab21-37409995c543-webhook-cert\") pod
\"obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl\" (UID: \"a82d9d9b-48e5-4262-ab21-37409995c543\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl" Mar 08 03:32:36.662728 master-0 kubenswrapper[13046]: I0308 03:32:36.662358 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j\" (UID: \"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j" Mar 08 03:32:36.662728 master-0 kubenswrapper[13046]: I0308 03:32:36.662447 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8m6ll"] Mar 08 03:32:36.687920 master-0 kubenswrapper[13046]: I0308 03:32:36.687856 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q" Mar 08 03:32:36.755419 master-0 kubenswrapper[13046]: I0308 03:32:36.755359 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/97636dbb-042d-43ec-ad26-bb802c7af1e1-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8m6ll\" (UID: \"97636dbb-042d-43ec-ad26-bb802c7af1e1\") " pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:36.755419 master-0 kubenswrapper[13046]: I0308 03:32:36.755427 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh5mp\" (UniqueName: \"kubernetes.io/projected/97636dbb-042d-43ec-ad26-bb802c7af1e1-kube-api-access-dh5mp\") pod \"observability-operator-59bdc8b94-8m6ll\" (UID: \"97636dbb-042d-43ec-ad26-bb802c7af1e1\") " 
pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:36.816866 master-0 kubenswrapper[13046]: I0308 03:32:36.816790 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j" Mar 08 03:32:36.838427 master-0 kubenswrapper[13046]: I0308 03:32:36.838378 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl" Mar 08 03:32:36.857888 master-0 kubenswrapper[13046]: I0308 03:32:36.857813 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/97636dbb-042d-43ec-ad26-bb802c7af1e1-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8m6ll\" (UID: \"97636dbb-042d-43ec-ad26-bb802c7af1e1\") " pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:36.857888 master-0 kubenswrapper[13046]: I0308 03:32:36.857883 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh5mp\" (UniqueName: \"kubernetes.io/projected/97636dbb-042d-43ec-ad26-bb802c7af1e1-kube-api-access-dh5mp\") pod \"observability-operator-59bdc8b94-8m6ll\" (UID: \"97636dbb-042d-43ec-ad26-bb802c7af1e1\") " pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:36.860969 master-0 kubenswrapper[13046]: I0308 03:32:36.860945 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/97636dbb-042d-43ec-ad26-bb802c7af1e1-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8m6ll\" (UID: \"97636dbb-042d-43ec-ad26-bb802c7af1e1\") " pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:36.879957 master-0 kubenswrapper[13046]: I0308 03:32:36.879454 13046 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-94c4t"] Mar 08 03:32:36.879957 master-0 kubenswrapper[13046]: I0308 03:32:36.879903 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh5mp\" (UniqueName: \"kubernetes.io/projected/97636dbb-042d-43ec-ad26-bb802c7af1e1-kube-api-access-dh5mp\") pod \"observability-operator-59bdc8b94-8m6ll\" (UID: \"97636dbb-042d-43ec-ad26-bb802c7af1e1\") " pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:36.880494 master-0 kubenswrapper[13046]: I0308 03:32:36.880451 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:36.882458 master-0 kubenswrapper[13046]: I0308 03:32:36.882366 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-94c4t"] Mar 08 03:32:37.001667 master-0 kubenswrapper[13046]: I0308 03:32:37.001206 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:37.063589 master-0 kubenswrapper[13046]: I0308 03:32:37.062084 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xlmk\" (UniqueName: \"kubernetes.io/projected/0098b3e6-efa5-424f-99ab-d487c0857ccd-kube-api-access-6xlmk\") pod \"perses-operator-5bf474d74f-94c4t\" (UID: \"0098b3e6-efa5-424f-99ab-d487c0857ccd\") " pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:37.063589 master-0 kubenswrapper[13046]: I0308 03:32:37.062255 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0098b3e6-efa5-424f-99ab-d487c0857ccd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-94c4t\" (UID: \"0098b3e6-efa5-424f-99ab-d487c0857ccd\") " pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:37.163763 master-0 kubenswrapper[13046]: I0308 03:32:37.163654 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0098b3e6-efa5-424f-99ab-d487c0857ccd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-94c4t\" (UID: \"0098b3e6-efa5-424f-99ab-d487c0857ccd\") " pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:37.163763 master-0 kubenswrapper[13046]: I0308 03:32:37.163739 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xlmk\" (UniqueName: \"kubernetes.io/projected/0098b3e6-efa5-424f-99ab-d487c0857ccd-kube-api-access-6xlmk\") pod \"perses-operator-5bf474d74f-94c4t\" (UID: \"0098b3e6-efa5-424f-99ab-d487c0857ccd\") " pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:37.165084 master-0 kubenswrapper[13046]: I0308 03:32:37.164699 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0098b3e6-efa5-424f-99ab-d487c0857ccd-openshift-service-ca\") pod \"perses-operator-5bf474d74f-94c4t\" (UID: \"0098b3e6-efa5-424f-99ab-d487c0857ccd\") " pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:37.339509 master-0 kubenswrapper[13046]: I0308 03:32:37.338522 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xlmk\" (UniqueName: \"kubernetes.io/projected/0098b3e6-efa5-424f-99ab-d487c0857ccd-kube-api-access-6xlmk\") pod \"perses-operator-5bf474d74f-94c4t\" (UID: \"0098b3e6-efa5-424f-99ab-d487c0857ccd\") " pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:37.537128 master-0 kubenswrapper[13046]: I0308 03:32:37.536955 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:37.754032 master-0 kubenswrapper[13046]: W0308 03:32:37.750379 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b238b7b_b7a1_4015_9d14_e7406c041e99.slice/crio-0f930ad744994043a1cf1a00c041abdcb6e3128ad18955a25aa0975a12cf80ab WatchSource:0}: Error finding container 0f930ad744994043a1cf1a00c041abdcb6e3128ad18955a25aa0975a12cf80ab: Status 404 returned error can't find the container with id 0f930ad744994043a1cf1a00c041abdcb6e3128ad18955a25aa0975a12cf80ab Mar 08 03:32:37.754032 master-0 kubenswrapper[13046]: I0308 03:32:37.751038 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q"] Mar 08 03:32:37.808924 master-0 kubenswrapper[13046]: I0308 03:32:37.808624 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j"] Mar 08 03:32:37.815795 master-0 kubenswrapper[13046]: I0308 03:32:37.814709 13046 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl"] Mar 08 03:32:37.822840 master-0 kubenswrapper[13046]: I0308 03:32:37.822785 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8m6ll"] Mar 08 03:32:38.082956 master-0 kubenswrapper[13046]: I0308 03:32:38.082909 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-94c4t"] Mar 08 03:32:38.088602 master-0 kubenswrapper[13046]: W0308 03:32:38.087901 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0098b3e6_efa5_424f_99ab_d487c0857ccd.slice/crio-020ac0675a2fb4a45045fd19eea5a3f1ef9f252c67e93c2f9c0f67cb6180e5e6 WatchSource:0}: Error finding container 020ac0675a2fb4a45045fd19eea5a3f1ef9f252c67e93c2f9c0f67cb6180e5e6: Status 404 returned error can't find the container with id 020ac0675a2fb4a45045fd19eea5a3f1ef9f252c67e93c2f9c0f67cb6180e5e6 Mar 08 03:32:38.171570 master-0 kubenswrapper[13046]: I0308 03:32:38.171518 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" event={"ID":"97636dbb-042d-43ec-ad26-bb802c7af1e1","Type":"ContainerStarted","Data":"857f99a77fc096de81b7e2ffb231ca7f1bb9f203e8740a801eedbb55ab13984c"} Mar 08 03:32:38.173400 master-0 kubenswrapper[13046]: I0308 03:32:38.172636 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j" event={"ID":"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48","Type":"ContainerStarted","Data":"61bcc9ad37f41cbd7f9d958b2a651a3ce1b68c72e0b3a892a7e98f9b73d57c8d"} Mar 08 03:32:38.174228 master-0 kubenswrapper[13046]: I0308 03:32:38.174198 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q" 
event={"ID":"6b238b7b-b7a1-4015-9d14-e7406c041e99","Type":"ContainerStarted","Data":"0f930ad744994043a1cf1a00c041abdcb6e3128ad18955a25aa0975a12cf80ab"} Mar 08 03:32:38.176744 master-0 kubenswrapper[13046]: I0308 03:32:38.176704 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-94c4t" event={"ID":"0098b3e6-efa5-424f-99ab-d487c0857ccd","Type":"ContainerStarted","Data":"020ac0675a2fb4a45045fd19eea5a3f1ef9f252c67e93c2f9c0f67cb6180e5e6"} Mar 08 03:32:38.178250 master-0 kubenswrapper[13046]: I0308 03:32:38.178218 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl" event={"ID":"a82d9d9b-48e5-4262-ab21-37409995c543","Type":"ContainerStarted","Data":"a58849ddc42523de39dbf427576c0c12deeec4416c59c7b024078d25ff2814c9"} Mar 08 03:32:48.021155 master-0 kubenswrapper[13046]: I0308 03:32:48.021068 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7fb7496f9c-6jvkl" Mar 08 03:32:50.568593 master-0 kubenswrapper[13046]: I0308 03:32:50.568526 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" event={"ID":"97636dbb-042d-43ec-ad26-bb802c7af1e1","Type":"ContainerStarted","Data":"10d02b96c0f73e93233d149a48ac3c63f82845ca75617381ca66cb034aa6a275"} Mar 08 03:32:50.570284 master-0 kubenswrapper[13046]: I0308 03:32:50.570169 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:50.575543 master-0 kubenswrapper[13046]: I0308 03:32:50.574628 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j" 
event={"ID":"69c9314a-ee4b-4e3c-a5bb-9bc7b4d50a48","Type":"ContainerStarted","Data":"abb7e33af7063b7acd4a6bc071be811b064d588fccb3a326518b7000524474cd"} Mar 08 03:32:50.579050 master-0 kubenswrapper[13046]: I0308 03:32:50.578933 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q" event={"ID":"6b238b7b-b7a1-4015-9d14-e7406c041e99","Type":"ContainerStarted","Data":"486a7b87cd5702c3303ec9ecff6229b93eba89e899267c3916669036068a895b"} Mar 08 03:32:50.584577 master-0 kubenswrapper[13046]: I0308 03:32:50.584398 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-94c4t" event={"ID":"0098b3e6-efa5-424f-99ab-d487c0857ccd","Type":"ContainerStarted","Data":"27ea0644338cf8f4434937ab4fb6bc9ee7b19c50d24765f60a1cea647faa4376"} Mar 08 03:32:50.585665 master-0 kubenswrapper[13046]: I0308 03:32:50.585617 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:32:50.588172 master-0 kubenswrapper[13046]: I0308 03:32:50.588122 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl" event={"ID":"a82d9d9b-48e5-4262-ab21-37409995c543","Type":"ContainerStarted","Data":"fa907ffbce7bffa838fbb22258c616d35bef1fc1a8b109b6176ad584147a9f47"} Mar 08 03:32:50.601227 master-0 kubenswrapper[13046]: I0308 03:32:50.601107 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" podStartSLOduration=2.990494594 podStartE2EDuration="14.601049561s" podCreationTimestamp="2026-03-08 03:32:36 +0000 UTC" firstStartedPulling="2026-03-08 03:32:37.834785192 +0000 UTC m=+1159.913552409" lastFinishedPulling="2026-03-08 03:32:49.445340159 +0000 UTC m=+1171.524107376" observedRunningTime="2026-03-08 03:32:50.600474394 +0000 UTC m=+1172.679241611" 
watchObservedRunningTime="2026-03-08 03:32:50.601049561 +0000 UTC m=+1172.679816828" Mar 08 03:32:50.630825 master-0 kubenswrapper[13046]: I0308 03:32:50.630655 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-94c4t" podStartSLOduration=3.307472428 podStartE2EDuration="14.630639047s" podCreationTimestamp="2026-03-08 03:32:36 +0000 UTC" firstStartedPulling="2026-03-08 03:32:38.096701909 +0000 UTC m=+1160.175469136" lastFinishedPulling="2026-03-08 03:32:49.419868538 +0000 UTC m=+1171.498635755" observedRunningTime="2026-03-08 03:32:50.629114214 +0000 UTC m=+1172.707881451" watchObservedRunningTime="2026-03-08 03:32:50.630639047 +0000 UTC m=+1172.709406264" Mar 08 03:32:50.637554 master-0 kubenswrapper[13046]: I0308 03:32:50.637445 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-8m6ll" Mar 08 03:32:50.657665 master-0 kubenswrapper[13046]: I0308 03:32:50.654878 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-8rt7j" podStartSLOduration=3.008676427 podStartE2EDuration="14.654858392s" podCreationTimestamp="2026-03-08 03:32:36 +0000 UTC" firstStartedPulling="2026-03-08 03:32:37.750545869 +0000 UTC m=+1159.829313086" lastFinishedPulling="2026-03-08 03:32:49.396727834 +0000 UTC m=+1171.475495051" observedRunningTime="2026-03-08 03:32:50.65479895 +0000 UTC m=+1172.733566167" watchObservedRunningTime="2026-03-08 03:32:50.654858392 +0000 UTC m=+1172.733625619" Mar 08 03:32:50.709247 master-0 kubenswrapper[13046]: I0308 03:32:50.708549 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-66d9f5965d-j57tl" podStartSLOduration=3.139392505 podStartE2EDuration="14.708477079s" podCreationTimestamp="2026-03-08 03:32:36 +0000 UTC" 
firstStartedPulling="2026-03-08 03:32:37.808623082 +0000 UTC m=+1159.887390299" lastFinishedPulling="2026-03-08 03:32:49.377707646 +0000 UTC m=+1171.456474873" observedRunningTime="2026-03-08 03:32:50.706991306 +0000 UTC m=+1172.785758513" watchObservedRunningTime="2026-03-08 03:32:50.708477079 +0000 UTC m=+1172.787244286" Mar 08 03:32:50.763866 master-0 kubenswrapper[13046]: I0308 03:32:50.763513 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-9g89q" podStartSLOduration=3.09645675 podStartE2EDuration="14.763469764s" podCreationTimestamp="2026-03-08 03:32:36 +0000 UTC" firstStartedPulling="2026-03-08 03:32:37.753538764 +0000 UTC m=+1159.832305981" lastFinishedPulling="2026-03-08 03:32:49.420551778 +0000 UTC m=+1171.499318995" observedRunningTime="2026-03-08 03:32:50.762891227 +0000 UTC m=+1172.841658454" watchObservedRunningTime="2026-03-08 03:32:50.763469764 +0000 UTC m=+1172.842236991" Mar 08 03:32:57.541779 master-0 kubenswrapper[13046]: I0308 03:32:57.541623 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-94c4t" Mar 08 03:33:07.219685 master-0 kubenswrapper[13046]: I0308 03:33:07.219630 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-68cfc6845d-mhm6t" Mar 08 03:33:15.263507 master-0 kubenswrapper[13046]: I0308 03:33:15.263413 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-546wf"] Mar 08 03:33:15.265762 master-0 kubenswrapper[13046]: I0308 03:33:15.265157 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.268466 master-0 kubenswrapper[13046]: I0308 03:33:15.268023 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 08 03:33:15.296220 master-0 kubenswrapper[13046]: I0308 03:33:15.296166 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-546wf"] Mar 08 03:33:15.318720 master-0 kubenswrapper[13046]: I0308 03:33:15.317114 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7jbsp"] Mar 08 03:33:15.321854 master-0 kubenswrapper[13046]: I0308 03:33:15.321814 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.328768 master-0 kubenswrapper[13046]: I0308 03:33:15.328706 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 08 03:33:15.328878 master-0 kubenswrapper[13046]: I0308 03:33:15.328816 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.395967 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-z8zzn"] Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.397419 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399167 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-metrics\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399200 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhx8d\" (UniqueName: \"kubernetes.io/projected/202b3558-b98e-401f-9c22-529f5a27dd5b-kube-api-access-qhx8d\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399234 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-conf\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399259 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-sockets\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399284 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-startup\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " 
pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399308 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-reloader\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399335 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b3558-b98e-401f-9c22-529f5a27dd5b-metrics-certs\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399363 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmxd\" (UniqueName: \"kubernetes.io/projected/2b5bd505-a6ef-490d-b7b4-83412df76a4f-kube-api-access-pjmxd\") pod \"frr-k8s-webhook-server-7f989f654f-546wf\" (UID: \"2b5bd505-a6ef-490d-b7b4-83412df76a4f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.399400 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b5bd505-a6ef-490d-b7b4-83412df76a4f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-546wf\" (UID: \"2b5bd505-a6ef-490d-b7b4-83412df76a4f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.401848 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.401919 13046 reflector.go:368] Caches 
populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 08 03:33:15.401973 master-0 kubenswrapper[13046]: I0308 03:33:15.401947 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 08 03:33:15.423733 master-0 kubenswrapper[13046]: I0308 03:33:15.423571 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-n8p77"] Mar 08 03:33:15.430234 master-0 kubenswrapper[13046]: I0308 03:33:15.425617 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.433912 master-0 kubenswrapper[13046]: I0308 03:33:15.432786 13046 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 08 03:33:15.440942 master-0 kubenswrapper[13046]: I0308 03:33:15.440711 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-n8p77"] Mar 08 03:33:15.501127 master-0 kubenswrapper[13046]: I0308 03:33:15.501057 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhx8d\" (UniqueName: \"kubernetes.io/projected/202b3558-b98e-401f-9c22-529f5a27dd5b-kube-api-access-qhx8d\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.501310 master-0 kubenswrapper[13046]: I0308 03:33:15.501146 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-conf\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.501310 master-0 kubenswrapper[13046]: I0308 03:33:15.501181 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-sockets\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.501310 master-0 kubenswrapper[13046]: I0308 03:33:15.501209 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sscdm\" (UniqueName: \"kubernetes.io/projected/72cc246d-ba12-4435-90fb-e8a0c307bb48-kube-api-access-sscdm\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.501310 master-0 kubenswrapper[13046]: I0308 03:33:15.501236 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/72cc246d-ba12-4435-90fb-e8a0c307bb48-metallb-excludel2\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.501310 master-0 kubenswrapper[13046]: I0308 03:33:15.501261 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8655289-e199-48db-be5c-78f68514a515-metrics-certs\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.501310 master-0 kubenswrapper[13046]: I0308 03:33:15.501281 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8655289-e199-48db-be5c-78f68514a515-cert\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.501537 master-0 kubenswrapper[13046]: I0308 03:33:15.501313 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" 
(UniqueName: \"kubernetes.io/configmap/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-startup\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.501537 master-0 kubenswrapper[13046]: I0308 03:33:15.501333 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndlz\" (UniqueName: \"kubernetes.io/projected/d8655289-e199-48db-be5c-78f68514a515-kube-api-access-8ndlz\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.501537 master-0 kubenswrapper[13046]: I0308 03:33:15.501366 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-reloader\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.501537 master-0 kubenswrapper[13046]: I0308 03:33:15.501388 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b3558-b98e-401f-9c22-529f5a27dd5b-metrics-certs\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.501537 master-0 kubenswrapper[13046]: I0308 03:33:15.501427 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmxd\" (UniqueName: \"kubernetes.io/projected/2b5bd505-a6ef-490d-b7b4-83412df76a4f-kube-api-access-pjmxd\") pod \"frr-k8s-webhook-server-7f989f654f-546wf\" (UID: \"2b5bd505-a6ef-490d-b7b4-83412df76a4f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.501537 master-0 kubenswrapper[13046]: I0308 03:33:15.501498 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2b5bd505-a6ef-490d-b7b4-83412df76a4f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-546wf\" (UID: \"2b5bd505-a6ef-490d-b7b4-83412df76a4f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.501713 master-0 kubenswrapper[13046]: I0308 03:33:15.501544 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.501713 master-0 kubenswrapper[13046]: I0308 03:33:15.501575 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-metrics-certs\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.502050 master-0 kubenswrapper[13046]: I0308 03:33:15.502019 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-reloader\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.502221 master-0 kubenswrapper[13046]: I0308 03:33:15.502192 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-conf\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.502398 master-0 kubenswrapper[13046]: I0308 03:33:15.502375 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-sockets\") pod \"frr-k8s-7jbsp\" 
(UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.502437 master-0 kubenswrapper[13046]: I0308 03:33:15.502417 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-metrics\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.502619 master-0 kubenswrapper[13046]: I0308 03:33:15.502598 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/202b3558-b98e-401f-9c22-529f5a27dd5b-metrics\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.504924 master-0 kubenswrapper[13046]: I0308 03:33:15.503289 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/202b3558-b98e-401f-9c22-529f5a27dd5b-frr-startup\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.505882 master-0 kubenswrapper[13046]: I0308 03:33:15.505848 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b3558-b98e-401f-9c22-529f5a27dd5b-metrics-certs\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.505971 master-0 kubenswrapper[13046]: I0308 03:33:15.505947 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b5bd505-a6ef-490d-b7b4-83412df76a4f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-546wf\" (UID: \"2b5bd505-a6ef-490d-b7b4-83412df76a4f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.526556 master-0 kubenswrapper[13046]: I0308 
03:33:15.525278 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhx8d\" (UniqueName: \"kubernetes.io/projected/202b3558-b98e-401f-9c22-529f5a27dd5b-kube-api-access-qhx8d\") pod \"frr-k8s-7jbsp\" (UID: \"202b3558-b98e-401f-9c22-529f5a27dd5b\") " pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.531662 master-0 kubenswrapper[13046]: I0308 03:33:15.531604 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmxd\" (UniqueName: \"kubernetes.io/projected/2b5bd505-a6ef-490d-b7b4-83412df76a4f-kube-api-access-pjmxd\") pod \"frr-k8s-webhook-server-7f989f654f-546wf\" (UID: \"2b5bd505-a6ef-490d-b7b4-83412df76a4f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.590246 master-0 kubenswrapper[13046]: I0308 03:33:15.590177 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:15.604562 master-0 kubenswrapper[13046]: I0308 03:33:15.604517 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.604654 master-0 kubenswrapper[13046]: I0308 03:33:15.604570 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-metrics-certs\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.604654 master-0 kubenswrapper[13046]: I0308 03:33:15.604623 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sscdm\" (UniqueName: \"kubernetes.io/projected/72cc246d-ba12-4435-90fb-e8a0c307bb48-kube-api-access-sscdm\") 
pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.604654 master-0 kubenswrapper[13046]: I0308 03:33:15.604641 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/72cc246d-ba12-4435-90fb-e8a0c307bb48-metallb-excludel2\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.604809 master-0 kubenswrapper[13046]: I0308 03:33:15.604657 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8655289-e199-48db-be5c-78f68514a515-metrics-certs\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.604809 master-0 kubenswrapper[13046]: I0308 03:33:15.604672 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8655289-e199-48db-be5c-78f68514a515-cert\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.604809 master-0 kubenswrapper[13046]: I0308 03:33:15.604695 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndlz\" (UniqueName: \"kubernetes.io/projected/d8655289-e199-48db-be5c-78f68514a515-kube-api-access-8ndlz\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.605353 master-0 kubenswrapper[13046]: E0308 03:33:15.605316 13046 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 03:33:15.605439 master-0 kubenswrapper[13046]: E0308 03:33:15.605371 13046 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist podName:72cc246d-ba12-4435-90fb-e8a0c307bb48 nodeName:}" failed. No retries permitted until 2026-03-08 03:33:16.105354931 +0000 UTC m=+1198.184122148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist") pod "speaker-z8zzn" (UID: "72cc246d-ba12-4435-90fb-e8a0c307bb48") : secret "metallb-memberlist" not found Mar 08 03:33:15.605562 master-0 kubenswrapper[13046]: E0308 03:33:15.605532 13046 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 08 03:33:15.605562 master-0 kubenswrapper[13046]: E0308 03:33:15.605561 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-metrics-certs podName:72cc246d-ba12-4435-90fb-e8a0c307bb48 nodeName:}" failed. No retries permitted until 2026-03-08 03:33:16.105554327 +0000 UTC m=+1198.184321544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-metrics-certs") pod "speaker-z8zzn" (UID: "72cc246d-ba12-4435-90fb-e8a0c307bb48") : secret "speaker-certs-secret" not found Mar 08 03:33:15.606576 master-0 kubenswrapper[13046]: I0308 03:33:15.606249 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/72cc246d-ba12-4435-90fb-e8a0c307bb48-metallb-excludel2\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.609815 master-0 kubenswrapper[13046]: I0308 03:33:15.609771 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8655289-e199-48db-be5c-78f68514a515-cert\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.614344 master-0 kubenswrapper[13046]: I0308 03:33:15.614302 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8655289-e199-48db-be5c-78f68514a515-metrics-certs\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.630897 master-0 kubenswrapper[13046]: I0308 03:33:15.630833 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sscdm\" (UniqueName: \"kubernetes.io/projected/72cc246d-ba12-4435-90fb-e8a0c307bb48-kube-api-access-sscdm\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:15.632498 master-0 kubenswrapper[13046]: I0308 03:33:15.632440 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndlz\" (UniqueName: 
\"kubernetes.io/projected/d8655289-e199-48db-be5c-78f68514a515-kube-api-access-8ndlz\") pod \"controller-86ddb6bd46-n8p77\" (UID: \"d8655289-e199-48db-be5c-78f68514a515\") " pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.646223 master-0 kubenswrapper[13046]: I0308 03:33:15.646151 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:15.765147 master-0 kubenswrapper[13046]: I0308 03:33:15.765081 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-n8p77" Mar 08 03:33:15.873146 master-0 kubenswrapper[13046]: I0308 03:33:15.873080 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerStarted","Data":"678d672baa50e01457fd05559b42a316bbd9422cd7e22893303bb05e5cdc3d0e"} Mar 08 03:33:16.019137 master-0 kubenswrapper[13046]: I0308 03:33:16.019084 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-546wf"] Mar 08 03:33:16.024024 master-0 kubenswrapper[13046]: W0308 03:33:16.023977 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b5bd505_a6ef_490d_b7b4_83412df76a4f.slice/crio-9c838d2e5eb09da65e291eb61e8c15694d922ce0a83bc3ec30a315152f84804c WatchSource:0}: Error finding container 9c838d2e5eb09da65e291eb61e8c15694d922ce0a83bc3ec30a315152f84804c: Status 404 returned error can't find the container with id 9c838d2e5eb09da65e291eb61e8c15694d922ce0a83bc3ec30a315152f84804c Mar 08 03:33:16.116320 master-0 kubenswrapper[13046]: I0308 03:33:16.116221 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist\") pod \"speaker-z8zzn\" (UID: 
\"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:16.117647 master-0 kubenswrapper[13046]: E0308 03:33:16.116449 13046 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 03:33:16.117647 master-0 kubenswrapper[13046]: I0308 03:33:16.116528 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-metrics-certs\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:16.117647 master-0 kubenswrapper[13046]: E0308 03:33:16.116629 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist podName:72cc246d-ba12-4435-90fb-e8a0c307bb48 nodeName:}" failed. No retries permitted until 2026-03-08 03:33:17.116591014 +0000 UTC m=+1199.195358281 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist") pod "speaker-z8zzn" (UID: "72cc246d-ba12-4435-90fb-e8a0c307bb48") : secret "metallb-memberlist" not found Mar 08 03:33:16.123449 master-0 kubenswrapper[13046]: I0308 03:33:16.121250 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-metrics-certs\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:16.271848 master-0 kubenswrapper[13046]: I0308 03:33:16.271368 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-n8p77"] Mar 08 03:33:16.886126 master-0 kubenswrapper[13046]: I0308 03:33:16.886047 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n8p77" event={"ID":"d8655289-e199-48db-be5c-78f68514a515","Type":"ContainerStarted","Data":"a6214313e0a16a64399c9d1e2292eb13699660fa031300f31a970ffe035ebfac"} Mar 08 03:33:16.886126 master-0 kubenswrapper[13046]: I0308 03:33:16.886117 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n8p77" event={"ID":"d8655289-e199-48db-be5c-78f68514a515","Type":"ContainerStarted","Data":"f9bb72502403845e3593e0ed8af27f4b119d2160250967392b2f1ad87d53a46a"} Mar 08 03:33:16.887413 master-0 kubenswrapper[13046]: I0308 03:33:16.887357 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" event={"ID":"2b5bd505-a6ef-490d-b7b4-83412df76a4f","Type":"ContainerStarted","Data":"9c838d2e5eb09da65e291eb61e8c15694d922ce0a83bc3ec30a315152f84804c"} Mar 08 03:33:17.137513 master-0 kubenswrapper[13046]: I0308 03:33:17.137376 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:17.140545 master-0 kubenswrapper[13046]: I0308 03:33:17.140440 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/72cc246d-ba12-4435-90fb-e8a0c307bb48-memberlist\") pod \"speaker-z8zzn\" (UID: \"72cc246d-ba12-4435-90fb-e8a0c307bb48\") " pod="metallb-system/speaker-z8zzn" Mar 08 03:33:17.226293 master-0 kubenswrapper[13046]: I0308 03:33:17.226217 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-z8zzn" Mar 08 03:33:17.243692 master-0 kubenswrapper[13046]: I0308 03:33:17.243632 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-7r4s5"] Mar 08 03:33:17.246238 master-0 kubenswrapper[13046]: I0308 03:33:17.246194 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" Mar 08 03:33:17.248402 master-0 kubenswrapper[13046]: W0308 03:33:17.248348 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72cc246d_ba12_4435_90fb_e8a0c307bb48.slice/crio-2d6c747cf551a2ee9a6c1ce058a67e6579a1db0fbfbc29f08b9fecb0784efd33 WatchSource:0}: Error finding container 2d6c747cf551a2ee9a6c1ce058a67e6579a1db0fbfbc29f08b9fecb0784efd33: Status 404 returned error can't find the container with id 2d6c747cf551a2ee9a6c1ce058a67e6579a1db0fbfbc29f08b9fecb0784efd33 Mar 08 03:33:17.274715 master-0 kubenswrapper[13046]: I0308 03:33:17.258715 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"] Mar 08 03:33:17.274715 master-0 kubenswrapper[13046]: I0308 03:33:17.259961 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" Mar 08 03:33:17.274715 master-0 kubenswrapper[13046]: I0308 03:33:17.261947 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 08 03:33:17.274715 master-0 kubenswrapper[13046]: I0308 03:33:17.272722 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-7r4s5"] Mar 08 03:33:17.283081 master-0 kubenswrapper[13046]: I0308 03:33:17.283027 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-rn6cd"] Mar 08 03:33:17.284915 master-0 kubenswrapper[13046]: I0308 03:33:17.284879 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.332284 master-0 kubenswrapper[13046]: I0308 03:33:17.330845 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"] Mar 08 03:33:17.341498 master-0 kubenswrapper[13046]: I0308 03:33:17.341401 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tszj\" (UniqueName: \"kubernetes.io/projected/e961965c-77d6-4dd8-b731-ecbdd4ef035d-kube-api-access-6tszj\") pod \"nmstate-webhook-786f45cff4-gpxzx\" (UID: \"e961965c-77d6-4dd8-b731-ecbdd4ef035d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" Mar 08 03:33:17.341737 master-0 kubenswrapper[13046]: I0308 03:33:17.341559 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m92wd\" (UniqueName: \"kubernetes.io/projected/1ca2dfb7-04ed-4252-a024-0287ba87ff9f-kube-api-access-m92wd\") pod \"nmstate-metrics-69594cc75-7r4s5\" (UID: \"1ca2dfb7-04ed-4252-a024-0287ba87ff9f\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" Mar 08 03:33:17.341737 master-0 kubenswrapper[13046]: I0308 
03:33:17.341596 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e961965c-77d6-4dd8-b731-ecbdd4ef035d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-gpxzx\" (UID: \"e961965c-77d6-4dd8-b731-ecbdd4ef035d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" Mar 08 03:33:17.422592 master-0 kubenswrapper[13046]: I0308 03:33:17.422455 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"] Mar 08 03:33:17.424046 master-0 kubenswrapper[13046]: I0308 03:33:17.424021 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9" Mar 08 03:33:17.427837 master-0 kubenswrapper[13046]: I0308 03:33:17.427782 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 08 03:33:17.427987 master-0 kubenswrapper[13046]: I0308 03:33:17.427959 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 08 03:33:17.429141 master-0 kubenswrapper[13046]: I0308 03:33:17.429099 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"] Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: I0308 03:33:17.442990 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e961965c-77d6-4dd8-b731-ecbdd4ef035d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-gpxzx\" (UID: \"e961965c-77d6-4dd8-b731-ecbdd4ef035d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: E0308 03:33:17.443154 13046 secret.go:189] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 08 03:33:17.446624 master-0 
kubenswrapper[13046]: E0308 03:33:17.443238 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e961965c-77d6-4dd8-b731-ecbdd4ef035d-tls-key-pair podName:e961965c-77d6-4dd8-b731-ecbdd4ef035d nodeName:}" failed. No retries permitted until 2026-03-08 03:33:17.943219743 +0000 UTC m=+1200.021986960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e961965c-77d6-4dd8-b731-ecbdd4ef035d-tls-key-pair") pod "nmstate-webhook-786f45cff4-gpxzx" (UID: "e961965c-77d6-4dd8-b731-ecbdd4ef035d") : secret "openshift-nmstate-webhook" not found Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: I0308 03:33:17.443446 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-dbus-socket\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: I0308 03:33:17.443767 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tszj\" (UniqueName: \"kubernetes.io/projected/e961965c-77d6-4dd8-b731-ecbdd4ef035d-kube-api-access-6tszj\") pod \"nmstate-webhook-786f45cff4-gpxzx\" (UID: \"e961965c-77d6-4dd8-b731-ecbdd4ef035d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: I0308 03:33:17.443799 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-nmstate-lock\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: I0308 03:33:17.443916 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-ovs-socket\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: I0308 03:33:17.443935 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djtvn\" (UniqueName: \"kubernetes.io/projected/5532eb26-c3d4-40a9-a0d8-3794569ef44b-kube-api-access-djtvn\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.446624 master-0 kubenswrapper[13046]: I0308 03:33:17.443998 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92wd\" (UniqueName: \"kubernetes.io/projected/1ca2dfb7-04ed-4252-a024-0287ba87ff9f-kube-api-access-m92wd\") pod \"nmstate-metrics-69594cc75-7r4s5\" (UID: \"1ca2dfb7-04ed-4252-a024-0287ba87ff9f\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" Mar 08 03:33:17.469943 master-0 kubenswrapper[13046]: I0308 03:33:17.463699 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tszj\" (UniqueName: \"kubernetes.io/projected/e961965c-77d6-4dd8-b731-ecbdd4ef035d-kube-api-access-6tszj\") pod \"nmstate-webhook-786f45cff4-gpxzx\" (UID: \"e961965c-77d6-4dd8-b731-ecbdd4ef035d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" Mar 08 03:33:17.471267 master-0 kubenswrapper[13046]: I0308 03:33:17.471110 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92wd\" (UniqueName: \"kubernetes.io/projected/1ca2dfb7-04ed-4252-a024-0287ba87ff9f-kube-api-access-m92wd\") pod \"nmstate-metrics-69594cc75-7r4s5\" (UID: \"1ca2dfb7-04ed-4252-a024-0287ba87ff9f\") " 
pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.546922 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-ovs-socket\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.546967 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djtvn\" (UniqueName: \"kubernetes.io/projected/5532eb26-c3d4-40a9-a0d8-3794569ef44b-kube-api-access-djtvn\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.546997 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b3fc40be-0506-4106-86f1-4ea0b3a66734-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.547054 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-dbus-socket\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.547092 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpnqg\" (UniqueName: \"kubernetes.io/projected/b3fc40be-0506-4106-86f1-4ea0b3a66734-kube-api-access-kpnqg\") pod 
\"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.547133 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3fc40be-0506-4106-86f1-4ea0b3a66734-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.547161 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-nmstate-lock\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.547228 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-nmstate-lock\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.547283 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-dbus-socket\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.548798 master-0 kubenswrapper[13046]: I0308 03:33:17.547703 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/5532eb26-c3d4-40a9-a0d8-3794569ef44b-ovs-socket\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.581737 master-0 kubenswrapper[13046]: I0308 03:33:17.581686 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djtvn\" (UniqueName: \"kubernetes.io/projected/5532eb26-c3d4-40a9-a0d8-3794569ef44b-kube-api-access-djtvn\") pod \"nmstate-handler-rn6cd\" (UID: \"5532eb26-c3d4-40a9-a0d8-3794569ef44b\") " pod="openshift-nmstate/nmstate-handler-rn6cd" Mar 08 03:33:17.620856 master-0 kubenswrapper[13046]: I0308 03:33:17.620810 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f654c497d-dwcp2"] Mar 08 03:33:17.622800 master-0 kubenswrapper[13046]: I0308 03:33:17.622769 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f654c497d-dwcp2" Mar 08 03:33:17.629477 master-0 kubenswrapper[13046]: I0308 03:33:17.629428 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5"
Mar 08 03:33:17.645092 master-0 kubenswrapper[13046]: I0308 03:33:17.644835 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f654c497d-dwcp2"]
Mar 08 03:33:17.657458 master-0 kubenswrapper[13046]: I0308 03:33:17.654735 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpnqg\" (UniqueName: \"kubernetes.io/projected/b3fc40be-0506-4106-86f1-4ea0b3a66734-kube-api-access-kpnqg\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:17.657458 master-0 kubenswrapper[13046]: I0308 03:33:17.654797 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3fc40be-0506-4106-86f1-4ea0b3a66734-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:17.657458 master-0 kubenswrapper[13046]: I0308 03:33:17.654853 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b3fc40be-0506-4106-86f1-4ea0b3a66734-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:17.657458 master-0 kubenswrapper[13046]: I0308 03:33:17.655659 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b3fc40be-0506-4106-86f1-4ea0b3a66734-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:17.657458 master-0 kubenswrapper[13046]: E0308 03:33:17.655909 13046 secret.go:189] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 08 03:33:17.657458 master-0 kubenswrapper[13046]: E0308 03:33:17.655942 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3fc40be-0506-4106-86f1-4ea0b3a66734-plugin-serving-cert podName:b3fc40be-0506-4106-86f1-4ea0b3a66734 nodeName:}" failed. No retries permitted until 2026-03-08 03:33:18.1559309 +0000 UTC m=+1200.234698117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/b3fc40be-0506-4106-86f1-4ea0b3a66734-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-gkhw9" (UID: "b3fc40be-0506-4106-86f1-4ea0b3a66734") : secret "plugin-serving-cert" not found
Mar 08 03:33:17.664254 master-0 kubenswrapper[13046]: I0308 03:33:17.664192 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-rn6cd"
Mar 08 03:33:17.677262 master-0 kubenswrapper[13046]: I0308 03:33:17.677189 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpnqg\" (UniqueName: \"kubernetes.io/projected/b3fc40be-0506-4106-86f1-4ea0b3a66734-kube-api-access-kpnqg\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:17.761230 master-0 kubenswrapper[13046]: I0308 03:33:17.757549 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv5mc\" (UniqueName: \"kubernetes.io/projected/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-kube-api-access-vv5mc\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.761230 master-0 kubenswrapper[13046]: I0308 03:33:17.757669 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-serving-cert\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.761230 master-0 kubenswrapper[13046]: I0308 03:33:17.757700 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-oauth-config\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.761230 master-0 kubenswrapper[13046]: I0308 03:33:17.757747 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-config\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.761230 master-0 kubenswrapper[13046]: I0308 03:33:17.757837 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-trusted-ca-bundle\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.761230 master-0 kubenswrapper[13046]: I0308 03:33:17.757887 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-oauth-serving-cert\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.761230 master-0 kubenswrapper[13046]: I0308 03:33:17.757979 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-service-ca\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.860573 master-0 kubenswrapper[13046]: I0308 03:33:17.860006 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-config\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.860573 master-0 kubenswrapper[13046]: I0308 03:33:17.860153 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-trusted-ca-bundle\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.860573 master-0 kubenswrapper[13046]: I0308 03:33:17.860193 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-oauth-serving-cert\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.860573 master-0 kubenswrapper[13046]: I0308 03:33:17.860246 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-service-ca\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.860573 master-0 kubenswrapper[13046]: I0308 03:33:17.860333 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv5mc\" (UniqueName: \"kubernetes.io/projected/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-kube-api-access-vv5mc\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.860573 master-0 kubenswrapper[13046]: I0308 03:33:17.860388 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-serving-cert\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.860956 master-0 kubenswrapper[13046]: I0308 03:33:17.860929 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-oauth-config\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.864581 master-0 kubenswrapper[13046]: I0308 03:33:17.861406 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-oauth-serving-cert\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.864581 master-0 kubenswrapper[13046]: I0308 03:33:17.862121 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-config\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.864581 master-0 kubenswrapper[13046]: I0308 03:33:17.862752 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-service-ca\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.864581 master-0 kubenswrapper[13046]: I0308 03:33:17.863764 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-trusted-ca-bundle\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.864581 master-0 kubenswrapper[13046]: I0308 03:33:17.863993 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-serving-cert\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.868732 master-0 kubenswrapper[13046]: I0308 03:33:17.868696 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-console-oauth-config\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.877315 master-0 kubenswrapper[13046]: I0308 03:33:17.877281 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv5mc\" (UniqueName: \"kubernetes.io/projected/42c1884e-96c1-46ee-a5dc-2267c7d84e2a-kube-api-access-vv5mc\") pod \"console-5f654c497d-dwcp2\" (UID: \"42c1884e-96c1-46ee-a5dc-2267c7d84e2a\") " pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:17.897039 master-0 kubenswrapper[13046]: I0308 03:33:17.896991 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z8zzn" event={"ID":"72cc246d-ba12-4435-90fb-e8a0c307bb48","Type":"ContainerStarted","Data":"da9e3ebb19e3589b3077e6507f88f3ccbd0da7b3644502c0b1965ccd75c0fdf4"}
Mar 08 03:33:17.897039 master-0 kubenswrapper[13046]: I0308 03:33:17.897042 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z8zzn" event={"ID":"72cc246d-ba12-4435-90fb-e8a0c307bb48","Type":"ContainerStarted","Data":"2d6c747cf551a2ee9a6c1ce058a67e6579a1db0fbfbc29f08b9fecb0784efd33"}
Mar 08 03:33:17.898170 master-0 kubenswrapper[13046]: I0308 03:33:17.898150 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rn6cd" event={"ID":"5532eb26-c3d4-40a9-a0d8-3794569ef44b","Type":"ContainerStarted","Data":"d41a6e9cc719f144ed33c9ed8c2a77722c9ca329403ca55bc3135600ac45d54e"}
Mar 08 03:33:17.899991 master-0 kubenswrapper[13046]: I0308 03:33:17.899966 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-n8p77" event={"ID":"d8655289-e199-48db-be5c-78f68514a515","Type":"ContainerStarted","Data":"18abc61ed8882e6439b27fda9daed668d2ba71b25239653914826d0769710440"}
Mar 08 03:33:17.900532 master-0 kubenswrapper[13046]: I0308 03:33:17.900212 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-n8p77"
Mar 08 03:33:17.925549 master-0 kubenswrapper[13046]: I0308 03:33:17.923267 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-n8p77" podStartSLOduration=1.642730198 podStartE2EDuration="2.923251143s" podCreationTimestamp="2026-03-08 03:33:15 +0000 UTC" firstStartedPulling="2026-03-08 03:33:16.443850432 +0000 UTC m=+1198.522617639" lastFinishedPulling="2026-03-08 03:33:17.724371367 +0000 UTC m=+1199.803138584" observedRunningTime="2026-03-08 03:33:17.920721911 +0000 UTC m=+1199.999489128" watchObservedRunningTime="2026-03-08 03:33:17.923251143 +0000 UTC m=+1200.002018360"
Mar 08 03:33:17.962731 master-0 kubenswrapper[13046]: I0308 03:33:17.962617 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e961965c-77d6-4dd8-b731-ecbdd4ef035d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-gpxzx\" (UID: \"e961965c-77d6-4dd8-b731-ecbdd4ef035d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"
Mar 08 03:33:17.965811 master-0 kubenswrapper[13046]: I0308 03:33:17.965760 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e961965c-77d6-4dd8-b731-ecbdd4ef035d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-gpxzx\" (UID: \"e961965c-77d6-4dd8-b731-ecbdd4ef035d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"
Mar 08 03:33:18.046834 master-0 kubenswrapper[13046]: I0308 03:33:18.046375 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:18.073141 master-0 kubenswrapper[13046]: I0308 03:33:18.073105 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-7r4s5"]
Mar 08 03:33:18.074464 master-0 kubenswrapper[13046]: W0308 03:33:18.074427 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca2dfb7_04ed_4252_a024_0287ba87ff9f.slice/crio-e69350f2dace6af271552392ebf520b4fb27a9456c767b5d9185665953ce54ea WatchSource:0}: Error finding container e69350f2dace6af271552392ebf520b4fb27a9456c767b5d9185665953ce54ea: Status 404 returned error can't find the container with id e69350f2dace6af271552392ebf520b4fb27a9456c767b5d9185665953ce54ea
Mar 08 03:33:18.166053 master-0 kubenswrapper[13046]: I0308 03:33:18.165863 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3fc40be-0506-4106-86f1-4ea0b3a66734-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:18.170498 master-0 kubenswrapper[13046]: I0308 03:33:18.169863 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3fc40be-0506-4106-86f1-4ea0b3a66734-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-gkhw9\" (UID: \"b3fc40be-0506-4106-86f1-4ea0b3a66734\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:18.259751 master-0 kubenswrapper[13046]: I0308 03:33:18.259644 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"
Mar 08 03:33:18.349129 master-0 kubenswrapper[13046]: I0308 03:33:18.345028 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"
Mar 08 03:33:18.501848 master-0 kubenswrapper[13046]: I0308 03:33:18.501812 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f654c497d-dwcp2"]
Mar 08 03:33:18.726651 master-0 kubenswrapper[13046]: I0308 03:33:18.726433 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9"]
Mar 08 03:33:18.797845 master-0 kubenswrapper[13046]: I0308 03:33:18.796690 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"]
Mar 08 03:33:18.911929 master-0 kubenswrapper[13046]: I0308 03:33:18.911872 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9" event={"ID":"b3fc40be-0506-4106-86f1-4ea0b3a66734","Type":"ContainerStarted","Data":"0c10e980cc5fc69f28affaaaf76eda814d3455f7207414dab8e075f0792e90da"}
Mar 08 03:33:18.913592 master-0 kubenswrapper[13046]: I0308 03:33:18.913547 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" event={"ID":"e961965c-77d6-4dd8-b731-ecbdd4ef035d","Type":"ContainerStarted","Data":"4c31f16deebe7c47679e30c2a66cad0e7d0c5ffc3ce4a2ebfd03418ec33ff644"}
Mar 08 03:33:18.915333 master-0 kubenswrapper[13046]: I0308 03:33:18.915307 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" event={"ID":"1ca2dfb7-04ed-4252-a024-0287ba87ff9f","Type":"ContainerStarted","Data":"e69350f2dace6af271552392ebf520b4fb27a9456c767b5d9185665953ce54ea"}
Mar 08 03:33:18.918737 master-0 kubenswrapper[13046]: I0308 03:33:18.918711 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f654c497d-dwcp2" event={"ID":"42c1884e-96c1-46ee-a5dc-2267c7d84e2a","Type":"ContainerStarted","Data":"c9af8bb4b5216d1cefbeb3f44ab0d3a5f96828f7ca87fa2d874928f15bc9eb6d"}
Mar 08 03:33:18.918797 master-0 kubenswrapper[13046]: I0308 03:33:18.918741 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f654c497d-dwcp2" event={"ID":"42c1884e-96c1-46ee-a5dc-2267c7d84e2a","Type":"ContainerStarted","Data":"efcf6e2d7a5259b84e3473cc607bf7350e967e317854ce7c7cb8cb99db2e9132"}
Mar 08 03:33:18.942224 master-0 kubenswrapper[13046]: I0308 03:33:18.942138 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f654c497d-dwcp2" podStartSLOduration=1.9421156750000002 podStartE2EDuration="1.942115675s" podCreationTimestamp="2026-03-08 03:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:33:18.936737803 +0000 UTC m=+1201.015505040" watchObservedRunningTime="2026-03-08 03:33:18.942115675 +0000 UTC m=+1201.020882892"
Mar 08 03:33:19.936280 master-0 kubenswrapper[13046]: I0308 03:33:19.935752 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z8zzn" event={"ID":"72cc246d-ba12-4435-90fb-e8a0c307bb48","Type":"ContainerStarted","Data":"c279e281e3251d9b971096b09a7093a4669ee26fc6504a5ed303e6fa36a0e6d8"}
Mar 08 03:33:19.936836 master-0 kubenswrapper[13046]: I0308 03:33:19.936413 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-z8zzn"
Mar 08 03:33:19.958348 master-0 kubenswrapper[13046]: I0308 03:33:19.958247 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-z8zzn" podStartSLOduration=3.547954014 podStartE2EDuration="4.958228529s" podCreationTimestamp="2026-03-08 03:33:15 +0000 UTC" firstStartedPulling="2026-03-08 03:33:17.621331062 +0000 UTC m=+1199.700098279" lastFinishedPulling="2026-03-08 03:33:19.031605587 +0000 UTC m=+1201.110372794" observedRunningTime="2026-03-08 03:33:19.954673889 +0000 UTC m=+1202.033441106" watchObservedRunningTime="2026-03-08 03:33:19.958228529 +0000 UTC m=+1202.036995736"
Mar 08 03:33:27.007402 master-0 kubenswrapper[13046]: I0308 03:33:27.007173 13046 generic.go:334] "Generic (PLEG): container finished" podID="202b3558-b98e-401f-9c22-529f5a27dd5b" containerID="39b848a491aba424636da76c968ce423319a29d9531fa6a7b909d1a742426293" exitCode=0
Mar 08 03:33:27.007402 master-0 kubenswrapper[13046]: I0308 03:33:27.007304 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerDied","Data":"39b848a491aba424636da76c968ce423319a29d9531fa6a7b909d1a742426293"}
Mar 08 03:33:27.009744 master-0 kubenswrapper[13046]: I0308 03:33:27.009499 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" event={"ID":"e961965c-77d6-4dd8-b731-ecbdd4ef035d","Type":"ContainerStarted","Data":"fe15639d231c3fc1b0e404cf3074397e372d35820bb7631f429461dc3d813a29"}
Mar 08 03:33:27.009744 master-0 kubenswrapper[13046]: I0308 03:33:27.009587 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"
Mar 08 03:33:27.013063 master-0 kubenswrapper[13046]: I0308 03:33:27.012667 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" event={"ID":"1ca2dfb7-04ed-4252-a024-0287ba87ff9f","Type":"ContainerStarted","Data":"81ce16d1b13f1332700ea6f9fdebb8da7a3c1cab6de459784efb692fdae446cd"}
Mar 08 03:33:27.013063 master-0 kubenswrapper[13046]: I0308 03:33:27.012811 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" event={"ID":"1ca2dfb7-04ed-4252-a024-0287ba87ff9f","Type":"ContainerStarted","Data":"fc87f3379bb162e481b4ab56ec80892fd380654e3b4c1fe3e286ea78b86fa6e8"}
Mar 08 03:33:27.015582 master-0 kubenswrapper[13046]: I0308 03:33:27.015548 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9" event={"ID":"b3fc40be-0506-4106-86f1-4ea0b3a66734","Type":"ContainerStarted","Data":"8a6348dafca762f19ec7a62165aa9279c5f70199a6c88e234c52d61417890ee5"}
Mar 08 03:33:27.019619 master-0 kubenswrapper[13046]: I0308 03:33:27.017522 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rn6cd" event={"ID":"5532eb26-c3d4-40a9-a0d8-3794569ef44b","Type":"ContainerStarted","Data":"bc4a55bab0cf7254728277c853d3fa6422512fb0f56d77eac56bb0eb0388f6f5"}
Mar 08 03:33:27.019619 master-0 kubenswrapper[13046]: I0308 03:33:27.017594 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-rn6cd"
Mar 08 03:33:27.071653 master-0 kubenswrapper[13046]: I0308 03:33:27.071419 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-gkhw9" podStartSLOduration=2.74367359 podStartE2EDuration="10.071393543s" podCreationTimestamp="2026-03-08 03:33:17 +0000 UTC" firstStartedPulling="2026-03-08 03:33:18.692237197 +0000 UTC m=+1200.771004414" lastFinishedPulling="2026-03-08 03:33:26.01995711 +0000 UTC m=+1208.098724367" observedRunningTime="2026-03-08 03:33:27.067033129 +0000 UTC m=+1209.145800346" watchObservedRunningTime="2026-03-08 03:33:27.071393543 +0000 UTC m=+1209.150160760"
Mar 08 03:33:27.134887 master-0 kubenswrapper[13046]: I0308 03:33:27.134820 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-rn6cd" podStartSLOduration=1.870731475 podStartE2EDuration="10.134799236s" podCreationTimestamp="2026-03-08 03:33:17 +0000 UTC" firstStartedPulling="2026-03-08 03:33:17.751859144 +0000 UTC m=+1199.830626361" lastFinishedPulling="2026-03-08 03:33:26.015926905 +0000 UTC m=+1208.094694122" observedRunningTime="2026-03-08 03:33:27.130298259 +0000 UTC m=+1209.209065486" watchObservedRunningTime="2026-03-08 03:33:27.134799236 +0000 UTC m=+1209.213566463"
Mar 08 03:33:27.167114 master-0 kubenswrapper[13046]: I0308 03:33:27.166888 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx" podStartSLOduration=2.922069857 podStartE2EDuration="10.166867734s" podCreationTimestamp="2026-03-08 03:33:17 +0000 UTC" firstStartedPulling="2026-03-08 03:33:18.814561017 +0000 UTC m=+1200.893328234" lastFinishedPulling="2026-03-08 03:33:26.059358894 +0000 UTC m=+1208.138126111" observedRunningTime="2026-03-08 03:33:27.16567782 +0000 UTC m=+1209.244445037" watchObservedRunningTime="2026-03-08 03:33:27.166867734 +0000 UTC m=+1209.245634961"
Mar 08 03:33:27.202179 master-0 kubenswrapper[13046]: I0308 03:33:27.200974 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-7r4s5" podStartSLOduration=2.259454712 podStartE2EDuration="10.200957558s" podCreationTimestamp="2026-03-08 03:33:17 +0000 UTC" firstStartedPulling="2026-03-08 03:33:18.076276902 +0000 UTC m=+1200.155044119" lastFinishedPulling="2026-03-08 03:33:26.017779708 +0000 UTC m=+1208.096546965" observedRunningTime="2026-03-08 03:33:27.19005955 +0000 UTC m=+1209.268826797" watchObservedRunningTime="2026-03-08 03:33:27.200957558 +0000 UTC m=+1209.279724775"
Mar 08 03:33:27.232540 master-0 kubenswrapper[13046]: I0308 03:33:27.230496 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-z8zzn"
Mar 08 03:33:28.026515 master-0 kubenswrapper[13046]: I0308 03:33:28.026382 13046 generic.go:334] "Generic (PLEG): container finished" podID="202b3558-b98e-401f-9c22-529f5a27dd5b" containerID="b5017bbaa7cd1c688221356bcb22363d4a84c4a8724a28b9a879171591b0273e" exitCode=0
Mar 08 03:33:28.026515 master-0 kubenswrapper[13046]: I0308 03:33:28.026461 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerDied","Data":"b5017bbaa7cd1c688221356bcb22363d4a84c4a8724a28b9a879171591b0273e"}
Mar 08 03:33:28.033134 master-0 kubenswrapper[13046]: I0308 03:33:28.033081 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" event={"ID":"2b5bd505-a6ef-490d-b7b4-83412df76a4f","Type":"ContainerStarted","Data":"2561f61bf875c0687a26d2cb820b2052c6d7673beebbdbd4a2d3952a399b2b0f"}
Mar 08 03:33:28.033134 master-0 kubenswrapper[13046]: I0308 03:33:28.033127 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf"
Mar 08 03:33:28.047813 master-0 kubenswrapper[13046]: I0308 03:33:28.047754 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:28.047813 master-0 kubenswrapper[13046]: I0308 03:33:28.047799 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:28.052269 master-0 kubenswrapper[13046]: I0308 03:33:28.052221 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:28.096266 master-0 kubenswrapper[13046]: I0308 03:33:28.096031 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" podStartSLOduration=1.249187145 podStartE2EDuration="13.095994538s" podCreationTimestamp="2026-03-08 03:33:15 +0000 UTC" firstStartedPulling="2026-03-08 03:33:16.027619347 +0000 UTC m=+1198.106386564" lastFinishedPulling="2026-03-08 03:33:27.87442674 +0000 UTC m=+1209.953193957" observedRunningTime="2026-03-08 03:33:28.084277186 +0000 UTC m=+1210.163044413" watchObservedRunningTime="2026-03-08 03:33:28.095994538 +0000 UTC m=+1210.174761755"
Mar 08 03:33:29.059301 master-0 kubenswrapper[13046]: I0308 03:33:29.059234 13046 generic.go:334] "Generic (PLEG): container finished" podID="202b3558-b98e-401f-9c22-529f5a27dd5b" containerID="55babdc6e941cd008701b7d0b12196b65bfe8de383e4d21728034975b4e595f1" exitCode=0
Mar 08 03:33:29.059961 master-0 kubenswrapper[13046]: I0308 03:33:29.059306 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerDied","Data":"55babdc6e941cd008701b7d0b12196b65bfe8de383e4d21728034975b4e595f1"}
Mar 08 03:33:29.065851 master-0 kubenswrapper[13046]: I0308 03:33:29.065792 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f654c497d-dwcp2"
Mar 08 03:33:29.187134 master-0 kubenswrapper[13046]: I0308 03:33:29.187075 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86bc7f4f4f-tzjk6"]
Mar 08 03:33:30.080981 master-0 kubenswrapper[13046]: I0308 03:33:30.080791 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerStarted","Data":"73d43b6a391c4c8d0928952c73d1a73155cf1a773116f0e51257d0eee8e96190"}
Mar 08 03:33:30.080981 master-0 kubenswrapper[13046]: I0308 03:33:30.080864 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerStarted","Data":"eac637e1099040cdde3d4b0d7c4559b27169a296486feae4c652c6f9932377cb"}
Mar 08 03:33:30.080981 master-0 kubenswrapper[13046]: I0308 03:33:30.080885 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerStarted","Data":"9a84839c20e81487c6128f0166f5017ed6cd5b0cffd849b6c2cba4fc86818571"}
Mar 08 03:33:30.080981 master-0 kubenswrapper[13046]: I0308 03:33:30.080904 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerStarted","Data":"2b3afc65df437dc6665579e3d41aaf2a24602513711f09a88ae68d5c7add6f86"}
Mar 08 03:33:30.080981 master-0 kubenswrapper[13046]: I0308 03:33:30.080918 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerStarted","Data":"0530c6b2719d1154ab3fb6a1d01bfc33007a270648d944871b5833dd91278018"}
Mar 08 03:33:31.103577 master-0 kubenswrapper[13046]: I0308 03:33:31.103188 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7jbsp" event={"ID":"202b3558-b98e-401f-9c22-529f5a27dd5b","Type":"ContainerStarted","Data":"381668cdf62dd35d515dabefe052ae67910d4e58108128ac79fbeec653c0e71a"}
Mar 08 03:33:31.104990 master-0 kubenswrapper[13046]: I0308 03:33:31.104246 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7jbsp"
Mar 08 03:33:31.164632 master-0 kubenswrapper[13046]: I0308 03:33:31.163023 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7jbsp" podStartSLOduration=5.875819736 podStartE2EDuration="16.162996439s" podCreationTimestamp="2026-03-08 03:33:15 +0000 UTC" firstStartedPulling="2026-03-08 03:33:15.783677176 +0000 UTC m=+1197.862444393" lastFinishedPulling="2026-03-08 03:33:26.070853869 +0000 UTC m=+1208.149621096" observedRunningTime="2026-03-08 03:33:31.153183112 +0000 UTC m=+1213.231950369" watchObservedRunningTime="2026-03-08 03:33:31.162996439 +0000 UTC m=+1213.241763686"
Mar 08 03:33:32.709108 master-0 kubenswrapper[13046]: I0308 03:33:32.709053 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-rn6cd"
Mar 08 03:33:35.647539 master-0 kubenswrapper[13046]: I0308 03:33:35.647441 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7jbsp"
Mar 08 03:33:35.719241 master-0 kubenswrapper[13046]: I0308 03:33:35.719164 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7jbsp"
Mar 08 03:33:35.773087 master-0 kubenswrapper[13046]: I0308 03:33:35.772557 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-n8p77"
Mar 08 03:33:38.269712 master-0 kubenswrapper[13046]: I0308 03:33:38.269610 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-gpxzx"
Mar 08 03:33:43.099250 master-0 kubenswrapper[13046]: I0308 03:33:43.099171 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-9dc9r"]
Mar 08 03:33:43.100406 master-0 kubenswrapper[13046]: I0308 03:33:43.100361 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.102580 master-0 kubenswrapper[13046]: I0308 03:33:43.102531 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert"
Mar 08 03:33:43.129715 master-0 kubenswrapper[13046]: I0308 03:33:43.126922 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-9dc9r"]
Mar 08 03:33:43.175300 master-0 kubenswrapper[13046]: I0308 03:33:43.175205 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-csi-plugin-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.175611 master-0 kubenswrapper[13046]: I0308 03:33:43.175325 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-lvmd-config\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.175611 master-0 kubenswrapper[13046]: I0308 03:33:43.175453 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-pod-volumes-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.175611 master-0 kubenswrapper[13046]: I0308 03:33:43.175586 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-registration-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.175833 master-0 kubenswrapper[13046]: I0308 03:33:43.175654 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-run-udev\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.175833 master-0 kubenswrapper[13046]: I0308 03:33:43.175690 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-file-lock-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.175962 master-0 kubenswrapper[13046]: I0308 03:33:43.175857 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-device-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.175962 master-0 kubenswrapper[13046]: I0308 03:33:43.175938 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrkn\" (UniqueName: \"kubernetes.io/projected/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-kube-api-access-dgrkn\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r"
Mar 08 03:33:43.176090 master-0 kubenswrapper[13046]: I0308 03:33:43.175984 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-metrics-cert\") pod
\"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.176090 master-0 kubenswrapper[13046]: I0308 03:33:43.176049 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-node-plugin-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.176318 master-0 kubenswrapper[13046]: I0308 03:33:43.176248 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-sys\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277189 master-0 kubenswrapper[13046]: I0308 03:33:43.277084 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-run-udev\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277463 master-0 kubenswrapper[13046]: I0308 03:33:43.277212 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-run-udev\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277463 master-0 kubenswrapper[13046]: I0308 03:33:43.277292 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-file-lock-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " 
pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277463 master-0 kubenswrapper[13046]: I0308 03:33:43.277355 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-device-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277463 master-0 kubenswrapper[13046]: I0308 03:33:43.277441 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-device-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277463 master-0 kubenswrapper[13046]: I0308 03:33:43.277442 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrkn\" (UniqueName: \"kubernetes.io/projected/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-kube-api-access-dgrkn\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277501 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-metrics-cert\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277548 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-node-plugin-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 
kubenswrapper[13046]: I0308 03:33:43.277626 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-sys\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277665 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-csi-plugin-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277692 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-lvmd-config\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277702 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-file-lock-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277731 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-pod-volumes-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277771 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-registration-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277791 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-pod-volumes-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.277850 master-0 kubenswrapper[13046]: I0308 03:33:43.277810 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-sys\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.278557 master-0 kubenswrapper[13046]: I0308 03:33:43.278242 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-csi-plugin-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.278557 master-0 kubenswrapper[13046]: I0308 03:33:43.278403 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-node-plugin-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.278557 master-0 kubenswrapper[13046]: I0308 03:33:43.278543 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: 
\"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-lvmd-config\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.278784 master-0 kubenswrapper[13046]: I0308 03:33:43.278628 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-registration-dir\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.292445 master-0 kubenswrapper[13046]: I0308 03:33:43.284148 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-metrics-cert\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.294592 master-0 kubenswrapper[13046]: I0308 03:33:43.293984 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrkn\" (UniqueName: \"kubernetes.io/projected/fe641585-1cd1-4ba3-aad7-cd111f7e6b6a-kube-api-access-dgrkn\") pod \"vg-manager-9dc9r\" (UID: \"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a\") " pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.445054 master-0 kubenswrapper[13046]: I0308 03:33:43.444856 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:43.938174 master-0 kubenswrapper[13046]: W0308 03:33:43.938106 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe641585_1cd1_4ba3_aad7_cd111f7e6b6a.slice/crio-e8bb97db1df9ee4fcc8a0147bd0e001ea634964c23c4a27d0d6ac85ad964f183 WatchSource:0}: Error finding container e8bb97db1df9ee4fcc8a0147bd0e001ea634964c23c4a27d0d6ac85ad964f183: Status 404 returned error can't find the container with id e8bb97db1df9ee4fcc8a0147bd0e001ea634964c23c4a27d0d6ac85ad964f183 Mar 08 03:33:43.955508 master-0 kubenswrapper[13046]: I0308 03:33:43.954846 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-9dc9r"] Mar 08 03:33:44.262562 master-0 kubenswrapper[13046]: I0308 03:33:44.262410 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9dc9r" event={"ID":"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a","Type":"ContainerStarted","Data":"208ee69fee03fd159d6a7e352d64c54d49dc658334ef79be11360498fb777a3c"} Mar 08 03:33:44.262562 master-0 kubenswrapper[13046]: I0308 03:33:44.262555 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9dc9r" event={"ID":"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a","Type":"ContainerStarted","Data":"e8bb97db1df9ee4fcc8a0147bd0e001ea634964c23c4a27d0d6ac85ad964f183"} Mar 08 03:33:44.296501 master-0 kubenswrapper[13046]: I0308 03:33:44.296320 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-9dc9r" podStartSLOduration=1.296290646 podStartE2EDuration="1.296290646s" podCreationTimestamp="2026-03-08 03:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:33:44.292249302 +0000 UTC m=+1226.371016529" watchObservedRunningTime="2026-03-08 03:33:44.296290646 +0000 
UTC m=+1226.375057873" Mar 08 03:33:45.601971 master-0 kubenswrapper[13046]: I0308 03:33:45.601923 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-546wf" Mar 08 03:33:45.650537 master-0 kubenswrapper[13046]: I0308 03:33:45.650400 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7jbsp" Mar 08 03:33:46.297105 master-0 kubenswrapper[13046]: I0308 03:33:46.296951 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-9dc9r_fe641585-1cd1-4ba3-aad7-cd111f7e6b6a/vg-manager/0.log" Mar 08 03:33:46.297105 master-0 kubenswrapper[13046]: I0308 03:33:46.297033 13046 generic.go:334] "Generic (PLEG): container finished" podID="fe641585-1cd1-4ba3-aad7-cd111f7e6b6a" containerID="208ee69fee03fd159d6a7e352d64c54d49dc658334ef79be11360498fb777a3c" exitCode=1 Mar 08 03:33:46.297105 master-0 kubenswrapper[13046]: I0308 03:33:46.297083 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9dc9r" event={"ID":"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a","Type":"ContainerDied","Data":"208ee69fee03fd159d6a7e352d64c54d49dc658334ef79be11360498fb777a3c"} Mar 08 03:33:46.298098 master-0 kubenswrapper[13046]: I0308 03:33:46.298039 13046 scope.go:117] "RemoveContainer" containerID="208ee69fee03fd159d6a7e352d64c54d49dc658334ef79be11360498fb777a3c" Mar 08 03:33:46.718879 master-0 kubenswrapper[13046]: I0308 03:33:46.718786 13046 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 08 03:33:47.026781 master-0 kubenswrapper[13046]: I0308 03:33:47.026347 13046 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-08T03:33:46.718825537Z","Handler":null,"Name":""} Mar 08 03:33:47.029101 master-0 
kubenswrapper[13046]: I0308 03:33:47.029078 13046 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 08 03:33:47.029228 master-0 kubenswrapper[13046]: I0308 03:33:47.029214 13046 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 08 03:33:47.308492 master-0 kubenswrapper[13046]: I0308 03:33:47.308356 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-9dc9r_fe641585-1cd1-4ba3-aad7-cd111f7e6b6a/vg-manager/0.log" Mar 08 03:33:47.308492 master-0 kubenswrapper[13046]: I0308 03:33:47.308417 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9dc9r" event={"ID":"fe641585-1cd1-4ba3-aad7-cd111f7e6b6a","Type":"ContainerStarted","Data":"783c5363a53ed4afc13ece402d3d0bb2ea0e6184c485f5bbbd86c9214195a098"} Mar 08 03:33:49.885053 master-0 kubenswrapper[13046]: I0308 03:33:49.884982 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7ffq4"] Mar 08 03:33:49.886064 master-0 kubenswrapper[13046]: I0308 03:33:49.886036 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:33:49.888992 master-0 kubenswrapper[13046]: I0308 03:33:49.888941 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 08 03:33:49.892789 master-0 kubenswrapper[13046]: I0308 03:33:49.892755 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 08 03:33:49.907484 master-0 kubenswrapper[13046]: I0308 03:33:49.907424 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7ffq4"] Mar 08 03:33:50.054025 master-0 kubenswrapper[13046]: I0308 03:33:50.053884 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr9nj\" (UniqueName: \"kubernetes.io/projected/fd57882e-68dd-4d2b-836d-441589dc7705-kube-api-access-qr9nj\") pod \"openstack-operator-index-7ffq4\" (UID: \"fd57882e-68dd-4d2b-836d-441589dc7705\") " pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:33:50.159772 master-0 kubenswrapper[13046]: I0308 03:33:50.159518 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr9nj\" (UniqueName: \"kubernetes.io/projected/fd57882e-68dd-4d2b-836d-441589dc7705-kube-api-access-qr9nj\") pod \"openstack-operator-index-7ffq4\" (UID: \"fd57882e-68dd-4d2b-836d-441589dc7705\") " pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:33:50.178538 master-0 kubenswrapper[13046]: I0308 03:33:50.178413 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr9nj\" (UniqueName: \"kubernetes.io/projected/fd57882e-68dd-4d2b-836d-441589dc7705-kube-api-access-qr9nj\") pod \"openstack-operator-index-7ffq4\" (UID: \"fd57882e-68dd-4d2b-836d-441589dc7705\") " pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:33:50.200011 master-0 
kubenswrapper[13046]: I0308 03:33:50.199955 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:33:50.624148 master-0 kubenswrapper[13046]: I0308 03:33:50.624043 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7ffq4"] Mar 08 03:33:51.354417 master-0 kubenswrapper[13046]: I0308 03:33:51.354237 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7ffq4" event={"ID":"fd57882e-68dd-4d2b-836d-441589dc7705","Type":"ContainerStarted","Data":"6143fd515ec03763b57dd306e8a4813d84f172b86237e2f3a1e6ac49a502a69a"} Mar 08 03:33:52.371264 master-0 kubenswrapper[13046]: I0308 03:33:52.371158 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7ffq4" event={"ID":"fd57882e-68dd-4d2b-836d-441589dc7705","Type":"ContainerStarted","Data":"20921a33d159c123ff1df41de87dc36d90458158c19c2ff8e7139ad7c48293d1"} Mar 08 03:33:52.405659 master-0 kubenswrapper[13046]: I0308 03:33:52.402772 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7ffq4" podStartSLOduration=2.602715297 podStartE2EDuration="3.402750589s" podCreationTimestamp="2026-03-08 03:33:49 +0000 UTC" firstStartedPulling="2026-03-08 03:33:50.640538398 +0000 UTC m=+1232.719305615" lastFinishedPulling="2026-03-08 03:33:51.44057368 +0000 UTC m=+1233.519340907" observedRunningTime="2026-03-08 03:33:52.395317548 +0000 UTC m=+1234.474084795" watchObservedRunningTime="2026-03-08 03:33:52.402750589 +0000 UTC m=+1234.481517816" Mar 08 03:33:53.445948 master-0 kubenswrapper[13046]: I0308 03:33:53.445883 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:53.448636 master-0 kubenswrapper[13046]: I0308 03:33:53.448588 13046 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:54.243589 master-0 kubenswrapper[13046]: I0308 03:33:54.243476 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-86bc7f4f4f-tzjk6" podUID="d9daabd8-0156-4337-b6d5-3eb664bf8663" containerName="console" containerID="cri-o://37c1f48e626b236b64b19ec0a2dc05cfbb955ce39684dc039c90d4354c0a1689" gracePeriod=15 Mar 08 03:33:54.397238 master-0 kubenswrapper[13046]: I0308 03:33:54.397185 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86bc7f4f4f-tzjk6_d9daabd8-0156-4337-b6d5-3eb664bf8663/console/0.log" Mar 08 03:33:54.397416 master-0 kubenswrapper[13046]: I0308 03:33:54.397247 13046 generic.go:334] "Generic (PLEG): container finished" podID="d9daabd8-0156-4337-b6d5-3eb664bf8663" containerID="37c1f48e626b236b64b19ec0a2dc05cfbb955ce39684dc039c90d4354c0a1689" exitCode=2 Mar 08 03:33:54.397416 master-0 kubenswrapper[13046]: I0308 03:33:54.397337 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bc7f4f4f-tzjk6" event={"ID":"d9daabd8-0156-4337-b6d5-3eb664bf8663","Type":"ContainerDied","Data":"37c1f48e626b236b64b19ec0a2dc05cfbb955ce39684dc039c90d4354c0a1689"} Mar 08 03:33:54.397601 master-0 kubenswrapper[13046]: I0308 03:33:54.397567 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:54.399084 master-0 kubenswrapper[13046]: I0308 03:33:54.399018 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-9dc9r" Mar 08 03:33:54.783640 master-0 kubenswrapper[13046]: I0308 03:33:54.783521 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86bc7f4f4f-tzjk6_d9daabd8-0156-4337-b6d5-3eb664bf8663/console/0.log" Mar 08 03:33:54.783640 master-0 kubenswrapper[13046]: I0308 03:33:54.783629 13046 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:33:54.954678 master-0 kubenswrapper[13046]: I0308 03:33:54.954607 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-service-ca\") pod \"d9daabd8-0156-4337-b6d5-3eb664bf8663\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " Mar 08 03:33:54.954874 master-0 kubenswrapper[13046]: I0308 03:33:54.954693 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-serving-cert\") pod \"d9daabd8-0156-4337-b6d5-3eb664bf8663\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " Mar 08 03:33:54.954874 master-0 kubenswrapper[13046]: I0308 03:33:54.954772 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-config\") pod \"d9daabd8-0156-4337-b6d5-3eb664bf8663\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " Mar 08 03:33:54.954874 master-0 kubenswrapper[13046]: I0308 03:33:54.954832 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-oauth-serving-cert\") pod \"d9daabd8-0156-4337-b6d5-3eb664bf8663\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.955309 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-service-ca" (OuterVolumeSpecName: "service-ca") pod "d9daabd8-0156-4337-b6d5-3eb664bf8663" (UID: "d9daabd8-0156-4337-b6d5-3eb664bf8663"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.955390 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-config" (OuterVolumeSpecName: "console-config") pod "d9daabd8-0156-4337-b6d5-3eb664bf8663" (UID: "d9daabd8-0156-4337-b6d5-3eb664bf8663"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.955513 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-trusted-ca-bundle\") pod \"d9daabd8-0156-4337-b6d5-3eb664bf8663\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.955627 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-oauth-config\") pod \"d9daabd8-0156-4337-b6d5-3eb664bf8663\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.955757 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6sdt\" (UniqueName: \"kubernetes.io/projected/d9daabd8-0156-4337-b6d5-3eb664bf8663-kube-api-access-x6sdt\") pod \"d9daabd8-0156-4337-b6d5-3eb664bf8663\" (UID: \"d9daabd8-0156-4337-b6d5-3eb664bf8663\") " Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.955646 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d9daabd8-0156-4337-b6d5-3eb664bf8663" (UID: 
"d9daabd8-0156-4337-b6d5-3eb664bf8663"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.956338 13046 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.956351 13046 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.956361 13046 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:33:54.956412 master-0 kubenswrapper[13046]: I0308 03:33:54.956345 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d9daabd8-0156-4337-b6d5-3eb664bf8663" (UID: "d9daabd8-0156-4337-b6d5-3eb664bf8663"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:33:54.959761 master-0 kubenswrapper[13046]: I0308 03:33:54.959735 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d9daabd8-0156-4337-b6d5-3eb664bf8663" (UID: "d9daabd8-0156-4337-b6d5-3eb664bf8663"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:33:54.959869 master-0 kubenswrapper[13046]: I0308 03:33:54.959818 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9daabd8-0156-4337-b6d5-3eb664bf8663-kube-api-access-x6sdt" (OuterVolumeSpecName: "kube-api-access-x6sdt") pod "d9daabd8-0156-4337-b6d5-3eb664bf8663" (UID: "d9daabd8-0156-4337-b6d5-3eb664bf8663"). InnerVolumeSpecName "kube-api-access-x6sdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:33:54.964102 master-0 kubenswrapper[13046]: I0308 03:33:54.963678 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d9daabd8-0156-4337-b6d5-3eb664bf8663" (UID: "d9daabd8-0156-4337-b6d5-3eb664bf8663"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:33:55.059099 master-0 kubenswrapper[13046]: I0308 03:33:55.058963 13046 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 08 03:33:55.059099 master-0 kubenswrapper[13046]: I0308 03:33:55.059028 13046 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9daabd8-0156-4337-b6d5-3eb664bf8663-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:33:55.059099 master-0 kubenswrapper[13046]: I0308 03:33:55.059049 13046 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d9daabd8-0156-4337-b6d5-3eb664bf8663-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:33:55.059099 master-0 kubenswrapper[13046]: I0308 03:33:55.059068 13046 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-x6sdt\" (UniqueName: \"kubernetes.io/projected/d9daabd8-0156-4337-b6d5-3eb664bf8663-kube-api-access-x6sdt\") on node \"master-0\" DevicePath \"\"" Mar 08 03:33:55.408965 master-0 kubenswrapper[13046]: I0308 03:33:55.408907 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86bc7f4f4f-tzjk6_d9daabd8-0156-4337-b6d5-3eb664bf8663/console/0.log" Mar 08 03:33:55.410063 master-0 kubenswrapper[13046]: I0308 03:33:55.410027 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86bc7f4f4f-tzjk6" Mar 08 03:33:55.424745 master-0 kubenswrapper[13046]: I0308 03:33:55.424655 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86bc7f4f4f-tzjk6" event={"ID":"d9daabd8-0156-4337-b6d5-3eb664bf8663","Type":"ContainerDied","Data":"c5dffead47a409534ba1ccb9f71586e10015d1d905f119eedcac49cde347e78d"} Mar 08 03:33:55.424937 master-0 kubenswrapper[13046]: I0308 03:33:55.424788 13046 scope.go:117] "RemoveContainer" containerID="37c1f48e626b236b64b19ec0a2dc05cfbb955ce39684dc039c90d4354c0a1689" Mar 08 03:33:55.575514 master-0 kubenswrapper[13046]: I0308 03:33:55.574563 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86bc7f4f4f-tzjk6"] Mar 08 03:33:55.582515 master-0 kubenswrapper[13046]: I0308 03:33:55.580815 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86bc7f4f4f-tzjk6"] Mar 08 03:33:56.131824 master-0 kubenswrapper[13046]: I0308 03:33:56.131732 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9daabd8-0156-4337-b6d5-3eb664bf8663" path="/var/lib/kubelet/pods/d9daabd8-0156-4337-b6d5-3eb664bf8663/volumes" Mar 08 03:34:00.201681 master-0 kubenswrapper[13046]: I0308 03:34:00.201587 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 
03:34:00.201681 master-0 kubenswrapper[13046]: I0308 03:34:00.201688 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:34:00.252086 master-0 kubenswrapper[13046]: I0308 03:34:00.252006 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:34:00.545173 master-0 kubenswrapper[13046]: I0308 03:34:00.545020 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7ffq4" Mar 08 03:34:07.630048 master-0 kubenswrapper[13046]: I0308 03:34:07.629957 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7"] Mar 08 03:34:07.631093 master-0 kubenswrapper[13046]: E0308 03:34:07.630643 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9daabd8-0156-4337-b6d5-3eb664bf8663" containerName="console" Mar 08 03:34:07.631093 master-0 kubenswrapper[13046]: I0308 03:34:07.630669 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9daabd8-0156-4337-b6d5-3eb664bf8663" containerName="console" Mar 08 03:34:07.631093 master-0 kubenswrapper[13046]: I0308 03:34:07.631046 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9daabd8-0156-4337-b6d5-3eb664bf8663" containerName="console" Mar 08 03:34:07.633116 master-0 kubenswrapper[13046]: I0308 03:34:07.633076 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.665198 master-0 kubenswrapper[13046]: I0308 03:34:07.665134 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7"] Mar 08 03:34:07.712070 master-0 kubenswrapper[13046]: I0308 03:34:07.711983 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.712280 master-0 kubenswrapper[13046]: I0308 03:34:07.712081 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.712280 master-0 kubenswrapper[13046]: I0308 03:34:07.712244 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6kpt\" (UniqueName: \"kubernetes.io/projected/6b60fab9-7af7-4945-be33-d495f643467c-kube-api-access-k6kpt\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.814853 master-0 kubenswrapper[13046]: I0308 03:34:07.814785 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.815246 master-0 kubenswrapper[13046]: I0308 03:34:07.815209 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.815599 master-0 kubenswrapper[13046]: I0308 03:34:07.815564 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6kpt\" (UniqueName: \"kubernetes.io/projected/6b60fab9-7af7-4945-be33-d495f643467c-kube-api-access-k6kpt\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.815889 master-0 kubenswrapper[13046]: I0308 03:34:07.815827 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.816099 master-0 kubenswrapper[13046]: I0308 03:34:07.816034 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-bundle\") pod 
\"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.853285 master-0 kubenswrapper[13046]: I0308 03:34:07.837687 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6kpt\" (UniqueName: \"kubernetes.io/projected/6b60fab9-7af7-4945-be33-d495f643467c-kube-api-access-k6kpt\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:07.959506 master-0 kubenswrapper[13046]: I0308 03:34:07.959258 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:08.432642 master-0 kubenswrapper[13046]: I0308 03:34:08.429768 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7"] Mar 08 03:34:08.580010 master-0 kubenswrapper[13046]: I0308 03:34:08.579907 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" event={"ID":"6b60fab9-7af7-4945-be33-d495f643467c","Type":"ContainerStarted","Data":"f2b060d97df48c12eaecf6df15950229527766216f35d6484a08f70a31a7fe50"} Mar 08 03:34:09.599181 master-0 kubenswrapper[13046]: I0308 03:34:09.599126 13046 generic.go:334] "Generic (PLEG): container finished" podID="6b60fab9-7af7-4945-be33-d495f643467c" containerID="35bb308e960a9501ed2489b7e7b0acd3c43877b8836092bb118b3c894dc28c29" exitCode=0 Mar 08 03:34:09.599181 master-0 kubenswrapper[13046]: I0308 03:34:09.599184 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" event={"ID":"6b60fab9-7af7-4945-be33-d495f643467c","Type":"ContainerDied","Data":"35bb308e960a9501ed2489b7e7b0acd3c43877b8836092bb118b3c894dc28c29"} Mar 08 03:34:10.616135 master-0 kubenswrapper[13046]: I0308 03:34:10.616052 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" event={"ID":"6b60fab9-7af7-4945-be33-d495f643467c","Type":"ContainerStarted","Data":"322078d47fd8bc9b4b9f19fb17e3f368150937f28886767aa32b5c930afd2f3e"} Mar 08 03:34:11.630902 master-0 kubenswrapper[13046]: I0308 03:34:11.630824 13046 generic.go:334] "Generic (PLEG): container finished" podID="6b60fab9-7af7-4945-be33-d495f643467c" containerID="322078d47fd8bc9b4b9f19fb17e3f368150937f28886767aa32b5c930afd2f3e" exitCode=0 Mar 08 03:34:11.630902 master-0 kubenswrapper[13046]: I0308 03:34:11.630882 13046 generic.go:334] "Generic (PLEG): container finished" podID="6b60fab9-7af7-4945-be33-d495f643467c" containerID="c2cf1d20888d9b267d12fb3e1f0c1ae5ab951ccef5eca1aaefa3d69cf8ce5e5f" exitCode=0 Mar 08 03:34:11.630902 master-0 kubenswrapper[13046]: I0308 03:34:11.630892 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" event={"ID":"6b60fab9-7af7-4945-be33-d495f643467c","Type":"ContainerDied","Data":"322078d47fd8bc9b4b9f19fb17e3f368150937f28886767aa32b5c930afd2f3e"} Mar 08 03:34:11.631832 master-0 kubenswrapper[13046]: I0308 03:34:11.630956 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" event={"ID":"6b60fab9-7af7-4945-be33-d495f643467c","Type":"ContainerDied","Data":"c2cf1d20888d9b267d12fb3e1f0c1ae5ab951ccef5eca1aaefa3d69cf8ce5e5f"} Mar 08 03:34:13.075213 master-0 kubenswrapper[13046]: I0308 03:34:13.074886 13046 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:13.122695 master-0 kubenswrapper[13046]: I0308 03:34:13.122607 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-bundle\") pod \"6b60fab9-7af7-4945-be33-d495f643467c\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " Mar 08 03:34:13.122954 master-0 kubenswrapper[13046]: I0308 03:34:13.122856 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-util\") pod \"6b60fab9-7af7-4945-be33-d495f643467c\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " Mar 08 03:34:13.123033 master-0 kubenswrapper[13046]: I0308 03:34:13.123017 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6kpt\" (UniqueName: \"kubernetes.io/projected/6b60fab9-7af7-4945-be33-d495f643467c-kube-api-access-k6kpt\") pod \"6b60fab9-7af7-4945-be33-d495f643467c\" (UID: \"6b60fab9-7af7-4945-be33-d495f643467c\") " Mar 08 03:34:13.126597 master-0 kubenswrapper[13046]: I0308 03:34:13.126539 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b60fab9-7af7-4945-be33-d495f643467c-kube-api-access-k6kpt" (OuterVolumeSpecName: "kube-api-access-k6kpt") pod "6b60fab9-7af7-4945-be33-d495f643467c" (UID: "6b60fab9-7af7-4945-be33-d495f643467c"). InnerVolumeSpecName "kube-api-access-k6kpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:34:13.127617 master-0 kubenswrapper[13046]: I0308 03:34:13.127567 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-bundle" (OuterVolumeSpecName: "bundle") pod "6b60fab9-7af7-4945-be33-d495f643467c" (UID: "6b60fab9-7af7-4945-be33-d495f643467c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:34:13.154779 master-0 kubenswrapper[13046]: I0308 03:34:13.154720 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-util" (OuterVolumeSpecName: "util") pod "6b60fab9-7af7-4945-be33-d495f643467c" (UID: "6b60fab9-7af7-4945-be33-d495f643467c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:34:13.224941 master-0 kubenswrapper[13046]: I0308 03:34:13.224863 13046 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-util\") on node \"master-0\" DevicePath \"\"" Mar 08 03:34:13.224941 master-0 kubenswrapper[13046]: I0308 03:34:13.224910 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6kpt\" (UniqueName: \"kubernetes.io/projected/6b60fab9-7af7-4945-be33-d495f643467c-kube-api-access-k6kpt\") on node \"master-0\" DevicePath \"\"" Mar 08 03:34:13.224941 master-0 kubenswrapper[13046]: I0308 03:34:13.224936 13046 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b60fab9-7af7-4945-be33-d495f643467c-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:34:13.654463 master-0 kubenswrapper[13046]: I0308 03:34:13.654384 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" 
event={"ID":"6b60fab9-7af7-4945-be33-d495f643467c","Type":"ContainerDied","Data":"f2b060d97df48c12eaecf6df15950229527766216f35d6484a08f70a31a7fe50"} Mar 08 03:34:13.654739 master-0 kubenswrapper[13046]: I0308 03:34:13.654470 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2b060d97df48c12eaecf6df15950229527766216f35d6484a08f70a31a7fe50" Mar 08 03:34:13.654739 master-0 kubenswrapper[13046]: I0308 03:34:13.654478 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7" Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: I0308 03:34:25.187649 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr"] Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: E0308 03:34:25.188544 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b60fab9-7af7-4945-be33-d495f643467c" containerName="util" Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: I0308 03:34:25.188563 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b60fab9-7af7-4945-be33-d495f643467c" containerName="util" Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: E0308 03:34:25.188622 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b60fab9-7af7-4945-be33-d495f643467c" containerName="pull" Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: I0308 03:34:25.188635 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b60fab9-7af7-4945-be33-d495f643467c" containerName="pull" Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: E0308 03:34:25.188694 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b60fab9-7af7-4945-be33-d495f643467c" containerName="extract" Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: I0308 03:34:25.188703 13046 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6b60fab9-7af7-4945-be33-d495f643467c" containerName="extract" Mar 08 03:34:25.189475 master-0 kubenswrapper[13046]: I0308 03:34:25.189114 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b60fab9-7af7-4945-be33-d495f643467c" containerName="extract" Mar 08 03:34:25.190414 master-0 kubenswrapper[13046]: I0308 03:34:25.190385 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" Mar 08 03:34:25.243117 master-0 kubenswrapper[13046]: I0308 03:34:25.243059 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr"] Mar 08 03:34:25.272058 master-0 kubenswrapper[13046]: I0308 03:34:25.271977 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhqjr\" (UniqueName: \"kubernetes.io/projected/4fbbe52c-ca98-4397-8f89-f75c4b5dce52-kube-api-access-zhqjr\") pod \"openstack-operator-controller-init-6f44f7b99f-6dgdr\" (UID: \"4fbbe52c-ca98-4397-8f89-f75c4b5dce52\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" Mar 08 03:34:25.373998 master-0 kubenswrapper[13046]: I0308 03:34:25.373914 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhqjr\" (UniqueName: \"kubernetes.io/projected/4fbbe52c-ca98-4397-8f89-f75c4b5dce52-kube-api-access-zhqjr\") pod \"openstack-operator-controller-init-6f44f7b99f-6dgdr\" (UID: \"4fbbe52c-ca98-4397-8f89-f75c4b5dce52\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" Mar 08 03:34:25.394414 master-0 kubenswrapper[13046]: I0308 03:34:25.394360 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhqjr\" (UniqueName: \"kubernetes.io/projected/4fbbe52c-ca98-4397-8f89-f75c4b5dce52-kube-api-access-zhqjr\") pod 
\"openstack-operator-controller-init-6f44f7b99f-6dgdr\" (UID: \"4fbbe52c-ca98-4397-8f89-f75c4b5dce52\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" Mar 08 03:34:25.557990 master-0 kubenswrapper[13046]: I0308 03:34:25.557888 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" Mar 08 03:34:26.000449 master-0 kubenswrapper[13046]: W0308 03:34:26.000380 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fbbe52c_ca98_4397_8f89_f75c4b5dce52.slice/crio-e7b6a7902caecf7008e9a2851ed9c25acb563b8b58a92876f2797a57761ddb73 WatchSource:0}: Error finding container e7b6a7902caecf7008e9a2851ed9c25acb563b8b58a92876f2797a57761ddb73: Status 404 returned error can't find the container with id e7b6a7902caecf7008e9a2851ed9c25acb563b8b58a92876f2797a57761ddb73 Mar 08 03:34:26.003926 master-0 kubenswrapper[13046]: I0308 03:34:26.003856 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr"] Mar 08 03:34:26.810109 master-0 kubenswrapper[13046]: I0308 03:34:26.805569 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" event={"ID":"4fbbe52c-ca98-4397-8f89-f75c4b5dce52","Type":"ContainerStarted","Data":"e7b6a7902caecf7008e9a2851ed9c25acb563b8b58a92876f2797a57761ddb73"} Mar 08 03:34:30.839975 master-0 kubenswrapper[13046]: I0308 03:34:30.839935 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" event={"ID":"4fbbe52c-ca98-4397-8f89-f75c4b5dce52","Type":"ContainerStarted","Data":"cca1bd78ccfad03fe3ce2c052f224474614bc88c1ce4ccf6b596967765ff0f52"} Mar 08 03:34:30.840572 master-0 kubenswrapper[13046]: I0308 03:34:30.840554 13046 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" Mar 08 03:34:30.887415 master-0 kubenswrapper[13046]: I0308 03:34:30.887319 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" podStartSLOduration=1.5237227039999999 podStartE2EDuration="5.887298555s" podCreationTimestamp="2026-03-08 03:34:25 +0000 UTC" firstStartedPulling="2026-03-08 03:34:26.00320979 +0000 UTC m=+1268.081977017" lastFinishedPulling="2026-03-08 03:34:30.366785651 +0000 UTC m=+1272.445552868" observedRunningTime="2026-03-08 03:34:30.880149252 +0000 UTC m=+1272.958916479" watchObservedRunningTime="2026-03-08 03:34:30.887298555 +0000 UTC m=+1272.966065772" Mar 08 03:34:35.565115 master-0 kubenswrapper[13046]: I0308 03:34:35.565040 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-6dgdr" Mar 08 03:34:56.237390 master-0 kubenswrapper[13046]: I0308 03:34:56.236674 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj"] Mar 08 03:34:56.238062 master-0 kubenswrapper[13046]: I0308 03:34:56.237966 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" Mar 08 03:34:56.251409 master-0 kubenswrapper[13046]: I0308 03:34:56.251346 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm"] Mar 08 03:34:56.252708 master-0 kubenswrapper[13046]: I0308 03:34:56.252641 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" Mar 08 03:34:56.260631 master-0 kubenswrapper[13046]: I0308 03:34:56.260557 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm"] Mar 08 03:34:56.266657 master-0 kubenswrapper[13046]: I0308 03:34:56.266459 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj"] Mar 08 03:34:56.326667 master-0 kubenswrapper[13046]: I0308 03:34:56.323499 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7bss\" (UniqueName: \"kubernetes.io/projected/494aa38f-eb0c-4f1a-9789-2c96aa18460c-kube-api-access-k7bss\") pod \"barbican-operator-controller-manager-6db6876945-dqzpj\" (UID: \"494aa38f-eb0c-4f1a-9789-2c96aa18460c\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" Mar 08 03:34:56.416637 master-0 kubenswrapper[13046]: I0308 03:34:56.416576 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5"] Mar 08 03:34:56.419026 master-0 kubenswrapper[13046]: I0308 03:34:56.418878 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" Mar 08 03:34:56.430529 master-0 kubenswrapper[13046]: I0308 03:34:56.428419 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48sb\" (UniqueName: \"kubernetes.io/projected/598125df-0908-4698-ac00-349bc90e6f9d-kube-api-access-n48sb\") pod \"cinder-operator-controller-manager-55d77d7b5c-lgvbm\" (UID: \"598125df-0908-4698-ac00-349bc90e6f9d\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" Mar 08 03:34:56.430529 master-0 kubenswrapper[13046]: I0308 03:34:56.428548 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bss\" (UniqueName: \"kubernetes.io/projected/494aa38f-eb0c-4f1a-9789-2c96aa18460c-kube-api-access-k7bss\") pod \"barbican-operator-controller-manager-6db6876945-dqzpj\" (UID: \"494aa38f-eb0c-4f1a-9789-2c96aa18460c\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" Mar 08 03:34:56.430529 master-0 kubenswrapper[13046]: I0308 03:34:56.430127 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr"] Mar 08 03:34:56.488146 master-0 kubenswrapper[13046]: I0308 03:34:56.488022 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5"] Mar 08 03:34:56.488146 master-0 kubenswrapper[13046]: I0308 03:34:56.488141 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" Mar 08 03:34:56.488799 master-0 kubenswrapper[13046]: I0308 03:34:56.488755 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bss\" (UniqueName: \"kubernetes.io/projected/494aa38f-eb0c-4f1a-9789-2c96aa18460c-kube-api-access-k7bss\") pod \"barbican-operator-controller-manager-6db6876945-dqzpj\" (UID: \"494aa38f-eb0c-4f1a-9789-2c96aa18460c\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" Mar 08 03:34:56.509828 master-0 kubenswrapper[13046]: I0308 03:34:56.504992 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr"] Mar 08 03:34:56.531552 master-0 kubenswrapper[13046]: I0308 03:34:56.529510 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n48sb\" (UniqueName: \"kubernetes.io/projected/598125df-0908-4698-ac00-349bc90e6f9d-kube-api-access-n48sb\") pod \"cinder-operator-controller-manager-55d77d7b5c-lgvbm\" (UID: \"598125df-0908-4698-ac00-349bc90e6f9d\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" Mar 08 03:34:56.531552 master-0 kubenswrapper[13046]: I0308 03:34:56.529676 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnkfh\" (UniqueName: \"kubernetes.io/projected/a346c90a-64e5-48d5-a428-77012a677ea6-kube-api-access-pnkfh\") pod \"designate-operator-controller-manager-5d87c9d997-kjfz5\" (UID: \"a346c90a-64e5-48d5-a428-77012a677ea6\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" Mar 08 03:34:56.531552 master-0 kubenswrapper[13046]: I0308 03:34:56.529992 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5"] Mar 08 03:34:56.531552 master-0 
kubenswrapper[13046]: I0308 03:34:56.530974 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" Mar 08 03:34:56.537104 master-0 kubenswrapper[13046]: I0308 03:34:56.536754 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g"] Mar 08 03:34:56.538821 master-0 kubenswrapper[13046]: I0308 03:34:56.538665 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" Mar 08 03:34:56.547475 master-0 kubenswrapper[13046]: I0308 03:34:56.545701 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5"] Mar 08 03:34:56.583083 master-0 kubenswrapper[13046]: I0308 03:34:56.563698 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n48sb\" (UniqueName: \"kubernetes.io/projected/598125df-0908-4698-ac00-349bc90e6f9d-kube-api-access-n48sb\") pod \"cinder-operator-controller-manager-55d77d7b5c-lgvbm\" (UID: \"598125df-0908-4698-ac00-349bc90e6f9d\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" Mar 08 03:34:56.583083 master-0 kubenswrapper[13046]: I0308 03:34:56.577745 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g"] Mar 08 03:34:56.629318 master-0 kubenswrapper[13046]: I0308 03:34:56.628992 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-65b58d74b-99265"] Mar 08 03:34:56.634342 master-0 kubenswrapper[13046]: I0308 03:34:56.633902 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnkfh\" (UniqueName: 
\"kubernetes.io/projected/a346c90a-64e5-48d5-a428-77012a677ea6-kube-api-access-pnkfh\") pod \"designate-operator-controller-manager-5d87c9d997-kjfz5\" (UID: \"a346c90a-64e5-48d5-a428-77012a677ea6\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" Mar 08 03:34:56.634342 master-0 kubenswrapper[13046]: I0308 03:34:56.634070 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlvn7\" (UniqueName: \"kubernetes.io/projected/fde9c4f2-1850-4139-9f22-73a12e2e66f8-kube-api-access-zlvn7\") pod \"glance-operator-controller-manager-64db6967f8-h8sbr\" (UID: \"fde9c4f2-1850-4139-9f22-73a12e2e66f8\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" Mar 08 03:34:56.634342 master-0 kubenswrapper[13046]: I0308 03:34:56.634168 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkcmw\" (UniqueName: \"kubernetes.io/projected/9a9e2b01-2399-4ea4-9eea-52d8e7050649-kube-api-access-gkcmw\") pod \"heat-operator-controller-manager-cf99c678f-d6sb5\" (UID: \"9a9e2b01-2399-4ea4-9eea-52d8e7050649\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" Mar 08 03:34:56.634342 master-0 kubenswrapper[13046]: I0308 03:34:56.634227 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9lv\" (UniqueName: \"kubernetes.io/projected/5f728375-b21a-4e8b-8c90-bd501c82d6b2-kube-api-access-jp9lv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-plz2g\" (UID: \"5f728375-b21a-4e8b-8c90-bd501c82d6b2\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" Mar 08 03:34:56.635295 master-0 kubenswrapper[13046]: I0308 03:34:56.635269 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:34:56.645122 master-0 kubenswrapper[13046]: I0308 03:34:56.644905 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 08 03:34:56.662624 master-0 kubenswrapper[13046]: I0308 03:34:56.659883 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" Mar 08 03:34:56.662624 master-0 kubenswrapper[13046]: I0308 03:34:56.661218 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65b58d74b-99265"] Mar 08 03:34:56.685317 master-0 kubenswrapper[13046]: I0308 03:34:56.685277 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" Mar 08 03:34:56.692511 master-0 kubenswrapper[13046]: I0308 03:34:56.691255 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnkfh\" (UniqueName: \"kubernetes.io/projected/a346c90a-64e5-48d5-a428-77012a677ea6-kube-api-access-pnkfh\") pod \"designate-operator-controller-manager-5d87c9d997-kjfz5\" (UID: \"a346c90a-64e5-48d5-a428-77012a677ea6\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" Mar 08 03:34:56.704532 master-0 kubenswrapper[13046]: I0308 03:34:56.701424 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs"] Mar 08 03:34:56.704532 master-0 kubenswrapper[13046]: I0308 03:34:56.703131 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" Mar 08 03:34:56.736838 master-0 kubenswrapper[13046]: I0308 03:34:56.735982 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:34:56.736838 master-0 kubenswrapper[13046]: I0308 03:34:56.736078 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlvn7\" (UniqueName: \"kubernetes.io/projected/fde9c4f2-1850-4139-9f22-73a12e2e66f8-kube-api-access-zlvn7\") pod \"glance-operator-controller-manager-64db6967f8-h8sbr\" (UID: \"fde9c4f2-1850-4139-9f22-73a12e2e66f8\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" Mar 08 03:34:56.736838 master-0 kubenswrapper[13046]: I0308 03:34:56.736125 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkcmw\" (UniqueName: \"kubernetes.io/projected/9a9e2b01-2399-4ea4-9eea-52d8e7050649-kube-api-access-gkcmw\") pod \"heat-operator-controller-manager-cf99c678f-d6sb5\" (UID: \"9a9e2b01-2399-4ea4-9eea-52d8e7050649\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" Mar 08 03:34:56.736838 master-0 kubenswrapper[13046]: I0308 03:34:56.736144 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znhmb\" (UniqueName: \"kubernetes.io/projected/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-kube-api-access-znhmb\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 
03:34:56.736838 master-0 kubenswrapper[13046]: I0308 03:34:56.736176 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9lv\" (UniqueName: \"kubernetes.io/projected/5f728375-b21a-4e8b-8c90-bd501c82d6b2-kube-api-access-jp9lv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-plz2g\" (UID: \"5f728375-b21a-4e8b-8c90-bd501c82d6b2\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" Mar 08 03:34:56.782546 master-0 kubenswrapper[13046]: I0308 03:34:56.777632 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs"] Mar 08 03:34:56.782546 master-0 kubenswrapper[13046]: I0308 03:34:56.780010 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9lv\" (UniqueName: \"kubernetes.io/projected/5f728375-b21a-4e8b-8c90-bd501c82d6b2-kube-api-access-jp9lv\") pod \"horizon-operator-controller-manager-78bc7f9bd9-plz2g\" (UID: \"5f728375-b21a-4e8b-8c90-bd501c82d6b2\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" Mar 08 03:34:56.786177 master-0 kubenswrapper[13046]: I0308 03:34:56.784838 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkcmw\" (UniqueName: \"kubernetes.io/projected/9a9e2b01-2399-4ea4-9eea-52d8e7050649-kube-api-access-gkcmw\") pod \"heat-operator-controller-manager-cf99c678f-d6sb5\" (UID: \"9a9e2b01-2399-4ea4-9eea-52d8e7050649\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" Mar 08 03:34:56.788608 master-0 kubenswrapper[13046]: I0308 03:34:56.788564 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9"] Mar 08 03:34:56.789769 master-0 kubenswrapper[13046]: I0308 03:34:56.789742 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" Mar 08 03:34:56.796302 master-0 kubenswrapper[13046]: I0308 03:34:56.790572 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" Mar 08 03:34:56.822912 master-0 kubenswrapper[13046]: I0308 03:34:56.820394 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlvn7\" (UniqueName: \"kubernetes.io/projected/fde9c4f2-1850-4139-9f22-73a12e2e66f8-kube-api-access-zlvn7\") pod \"glance-operator-controller-manager-64db6967f8-h8sbr\" (UID: \"fde9c4f2-1850-4139-9f22-73a12e2e66f8\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" Mar 08 03:34:56.845502 master-0 kubenswrapper[13046]: I0308 03:34:56.837260 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znhmb\" (UniqueName: \"kubernetes.io/projected/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-kube-api-access-znhmb\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:34:56.845502 master-0 kubenswrapper[13046]: I0308 03:34:56.837350 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvf9\" (UniqueName: \"kubernetes.io/projected/22e4bfb2-d3d5-420e-b79e-d36ead75b302-kube-api-access-rgvf9\") pod \"keystone-operator-controller-manager-7c789f89c6-v2kb9\" (UID: \"22e4bfb2-d3d5-420e-b79e-d36ead75b302\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" Mar 08 03:34:56.845502 master-0 kubenswrapper[13046]: I0308 03:34:56.837827 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:34:56.845502 master-0 kubenswrapper[13046]: I0308 03:34:56.838040 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvs6h\" (UniqueName: \"kubernetes.io/projected/5adb5035-1d71-4fc4-8174-b47b3f367be2-kube-api-access-fvs6h\") pod \"ironic-operator-controller-manager-545456dc4-v9kvs\" (UID: \"5adb5035-1d71-4fc4-8174-b47b3f367be2\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" Mar 08 03:34:56.845502 master-0 kubenswrapper[13046]: E0308 03:34:56.838251 13046 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 03:34:56.845502 master-0 kubenswrapper[13046]: E0308 03:34:56.842368 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert podName:17f3dac6-f1b5-4396-abe1-c2ef1e3a321e nodeName:}" failed. No retries permitted until 2026-03-08 03:34:57.342342654 +0000 UTC m=+1299.421109871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert") pod "infra-operator-controller-manager-65b58d74b-99265" (UID: "17f3dac6-f1b5-4396-abe1-c2ef1e3a321e") : secret "infra-operator-webhook-server-cert" not found Mar 08 03:34:56.845502 master-0 kubenswrapper[13046]: I0308 03:34:56.842890 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9"] Mar 08 03:34:56.847776 master-0 kubenswrapper[13046]: I0308 03:34:56.847737 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" Mar 08 03:34:56.874684 master-0 kubenswrapper[13046]: I0308 03:34:56.874646 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znhmb\" (UniqueName: \"kubernetes.io/projected/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-kube-api-access-znhmb\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:34:56.910563 master-0 kubenswrapper[13046]: I0308 03:34:56.908701 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp"] Mar 08 03:34:56.910563 master-0 kubenswrapper[13046]: I0308 03:34:56.910260 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" Mar 08 03:34:56.914924 master-0 kubenswrapper[13046]: I0308 03:34:56.914397 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" Mar 08 03:34:56.941441 master-0 kubenswrapper[13046]: I0308 03:34:56.941403 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvs6h\" (UniqueName: \"kubernetes.io/projected/5adb5035-1d71-4fc4-8174-b47b3f367be2-kube-api-access-fvs6h\") pod \"ironic-operator-controller-manager-545456dc4-v9kvs\" (UID: \"5adb5035-1d71-4fc4-8174-b47b3f367be2\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" Mar 08 03:34:56.942511 master-0 kubenswrapper[13046]: I0308 03:34:56.941880 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvf9\" (UniqueName: \"kubernetes.io/projected/22e4bfb2-d3d5-420e-b79e-d36ead75b302-kube-api-access-rgvf9\") pod \"keystone-operator-controller-manager-7c789f89c6-v2kb9\" (UID: \"22e4bfb2-d3d5-420e-b79e-d36ead75b302\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" Mar 08 03:34:56.945558 master-0 kubenswrapper[13046]: I0308 03:34:56.944837 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp"] Mar 08 03:34:56.963544 master-0 kubenswrapper[13046]: I0308 03:34:56.962246 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvs6h\" (UniqueName: \"kubernetes.io/projected/5adb5035-1d71-4fc4-8174-b47b3f367be2-kube-api-access-fvs6h\") pod \"ironic-operator-controller-manager-545456dc4-v9kvs\" (UID: \"5adb5035-1d71-4fc4-8174-b47b3f367be2\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" Mar 08 03:34:56.965436 master-0 kubenswrapper[13046]: I0308 03:34:56.965388 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvf9\" (UniqueName: 
\"kubernetes.io/projected/22e4bfb2-d3d5-420e-b79e-d36ead75b302-kube-api-access-rgvf9\") pod \"keystone-operator-controller-manager-7c789f89c6-v2kb9\" (UID: \"22e4bfb2-d3d5-420e-b79e-d36ead75b302\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" Mar 08 03:34:56.967797 master-0 kubenswrapper[13046]: I0308 03:34:56.965790 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h"] Mar 08 03:34:56.968325 master-0 kubenswrapper[13046]: I0308 03:34:56.968168 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" Mar 08 03:34:56.981242 master-0 kubenswrapper[13046]: I0308 03:34:56.980696 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6"] Mar 08 03:34:56.983605 master-0 kubenswrapper[13046]: I0308 03:34:56.983258 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" Mar 08 03:34:56.990371 master-0 kubenswrapper[13046]: I0308 03:34:56.986364 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" Mar 08 03:34:57.028879 master-0 kubenswrapper[13046]: I0308 03:34:57.028834 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h"] Mar 08 03:34:57.052661 master-0 kubenswrapper[13046]: I0308 03:34:57.052083 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t882\" (UniqueName: \"kubernetes.io/projected/389e5461-3240-444e-8230-621192f5bc87-kube-api-access-5t882\") pod \"manila-operator-controller-manager-67d996989d-4c8zp\" (UID: \"389e5461-3240-444e-8230-621192f5bc87\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" Mar 08 03:34:57.052661 master-0 kubenswrapper[13046]: I0308 03:34:57.052294 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8zrl\" (UniqueName: \"kubernetes.io/projected/42aeff7c-1920-43a9-aaaf-c8f9f7d54752-kube-api-access-r8zrl\") pod \"neutron-operator-controller-manager-54688575f-xf2d6\" (UID: \"42aeff7c-1920-43a9-aaaf-c8f9f7d54752\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" Mar 08 03:34:57.052661 master-0 kubenswrapper[13046]: I0308 03:34:57.052367 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqklq\" (UniqueName: \"kubernetes.io/projected/c470d5a0-46d5-4a08-bd51-7feddc4beaef-kube-api-access-tqklq\") pod \"mariadb-operator-controller-manager-7b6bfb6475-ggv2h\" (UID: \"c470d5a0-46d5-4a08-bd51-7feddc4beaef\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" Mar 08 03:34:57.068222 master-0 kubenswrapper[13046]: I0308 03:34:57.060968 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl"] 
Mar 08 03:34:57.069313 master-0 kubenswrapper[13046]: I0308 03:34:57.069274 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" Mar 08 03:34:57.134547 master-0 kubenswrapper[13046]: I0308 03:34:57.131863 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" Mar 08 03:34:57.168181 master-0 kubenswrapper[13046]: I0308 03:34:57.157567 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6"] Mar 08 03:34:57.168181 master-0 kubenswrapper[13046]: I0308 03:34:57.158686 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" Mar 08 03:34:57.168181 master-0 kubenswrapper[13046]: I0308 03:34:57.160567 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t882\" (UniqueName: \"kubernetes.io/projected/389e5461-3240-444e-8230-621192f5bc87-kube-api-access-5t882\") pod \"manila-operator-controller-manager-67d996989d-4c8zp\" (UID: \"389e5461-3240-444e-8230-621192f5bc87\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" Mar 08 03:34:57.168181 master-0 kubenswrapper[13046]: I0308 03:34:57.160645 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6dgc\" (UniqueName: \"kubernetes.io/projected/9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3-kube-api-access-d6dgc\") pod \"nova-operator-controller-manager-74b6b5dc96-lsdhl\" (UID: \"9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" Mar 08 03:34:57.168181 master-0 kubenswrapper[13046]: I0308 03:34:57.160678 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r8zrl\" (UniqueName: \"kubernetes.io/projected/42aeff7c-1920-43a9-aaaf-c8f9f7d54752-kube-api-access-r8zrl\") pod \"neutron-operator-controller-manager-54688575f-xf2d6\" (UID: \"42aeff7c-1920-43a9-aaaf-c8f9f7d54752\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" Mar 08 03:34:57.168181 master-0 kubenswrapper[13046]: I0308 03:34:57.160717 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqklq\" (UniqueName: \"kubernetes.io/projected/c470d5a0-46d5-4a08-bd51-7feddc4beaef-kube-api-access-tqklq\") pod \"mariadb-operator-controller-manager-7b6bfb6475-ggv2h\" (UID: \"c470d5a0-46d5-4a08-bd51-7feddc4beaef\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" Mar 08 03:34:57.200423 master-0 kubenswrapper[13046]: I0308 03:34:57.200370 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8zrl\" (UniqueName: \"kubernetes.io/projected/42aeff7c-1920-43a9-aaaf-c8f9f7d54752-kube-api-access-r8zrl\") pod \"neutron-operator-controller-manager-54688575f-xf2d6\" (UID: \"42aeff7c-1920-43a9-aaaf-c8f9f7d54752\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" Mar 08 03:34:57.202388 master-0 kubenswrapper[13046]: I0308 03:34:57.201606 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t882\" (UniqueName: \"kubernetes.io/projected/389e5461-3240-444e-8230-621192f5bc87-kube-api-access-5t882\") pod \"manila-operator-controller-manager-67d996989d-4c8zp\" (UID: \"389e5461-3240-444e-8230-621192f5bc87\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" Mar 08 03:34:57.204902 master-0 kubenswrapper[13046]: I0308 03:34:57.203503 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqklq\" (UniqueName: 
\"kubernetes.io/projected/c470d5a0-46d5-4a08-bd51-7feddc4beaef-kube-api-access-tqklq\") pod \"mariadb-operator-controller-manager-7b6bfb6475-ggv2h\" (UID: \"c470d5a0-46d5-4a08-bd51-7feddc4beaef\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" Mar 08 03:34:57.260879 master-0 kubenswrapper[13046]: I0308 03:34:57.260821 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" Mar 08 03:34:57.263248 master-0 kubenswrapper[13046]: I0308 03:34:57.263198 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6dgc\" (UniqueName: \"kubernetes.io/projected/9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3-kube-api-access-d6dgc\") pod \"nova-operator-controller-manager-74b6b5dc96-lsdhl\" (UID: \"9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" Mar 08 03:34:57.271331 master-0 kubenswrapper[13046]: I0308 03:34:57.271187 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl"] Mar 08 03:34:57.281600 master-0 kubenswrapper[13046]: I0308 03:34:57.281560 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6dgc\" (UniqueName: \"kubernetes.io/projected/9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3-kube-api-access-d6dgc\") pod \"nova-operator-controller-manager-74b6b5dc96-lsdhl\" (UID: \"9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" Mar 08 03:34:57.290578 master-0 kubenswrapper[13046]: I0308 03:34:57.290363 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj"] Mar 08 03:34:57.293915 master-0 kubenswrapper[13046]: I0308 03:34:57.293882 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" Mar 08 03:34:57.331205 master-0 kubenswrapper[13046]: I0308 03:34:57.331112 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj"] Mar 08 03:34:57.337788 master-0 kubenswrapper[13046]: I0308 03:34:57.337730 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" Mar 08 03:34:57.343169 master-0 kubenswrapper[13046]: W0308 03:34:57.340442 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod598125df_0908_4698_ac00_349bc90e6f9d.slice/crio-f25af752705dae7a3efabd677feeae3df5fe2511f588964d3ad2363d5521b954 WatchSource:0}: Error finding container f25af752705dae7a3efabd677feeae3df5fe2511f588964d3ad2363d5521b954: Status 404 returned error can't find the container with id f25af752705dae7a3efabd677feeae3df5fe2511f588964d3ad2363d5521b954 Mar 08 03:34:57.344651 master-0 kubenswrapper[13046]: I0308 03:34:57.344082 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6"] Mar 08 03:34:57.345532 master-0 kubenswrapper[13046]: I0308 03:34:57.345402 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:57.349534 master-0 kubenswrapper[13046]: I0308 03:34:57.347997 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 08 03:34:57.353438 master-0 kubenswrapper[13046]: I0308 03:34:57.353352 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9"] Mar 08 03:34:57.356560 master-0 kubenswrapper[13046]: I0308 03:34:57.356520 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" Mar 08 03:34:57.364220 master-0 kubenswrapper[13046]: I0308 03:34:57.364169 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:34:57.364364 master-0 kubenswrapper[13046]: I0308 03:34:57.364264 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2csgj\" (UniqueName: \"kubernetes.io/projected/815174b8-3094-4bd1-bc8c-4b47adcfdcea-kube-api-access-2csgj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-mxcfj\" (UID: \"815174b8-3094-4bd1-bc8c-4b47adcfdcea\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" Mar 08 03:34:57.367445 master-0 kubenswrapper[13046]: E0308 03:34:57.366554 13046 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 03:34:57.367445 master-0 kubenswrapper[13046]: I0308 03:34:57.367295 13046 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6"] Mar 08 03:34:57.367445 master-0 kubenswrapper[13046]: E0308 03:34:57.367324 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert podName:17f3dac6-f1b5-4396-abe1-c2ef1e3a321e nodeName:}" failed. No retries permitted until 2026-03-08 03:34:58.367303585 +0000 UTC m=+1300.446070802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert") pod "infra-operator-controller-manager-65b58d74b-99265" (UID: "17f3dac6-f1b5-4396-abe1-c2ef1e3a321e") : secret "infra-operator-webhook-server-cert" not found Mar 08 03:34:57.374032 master-0 kubenswrapper[13046]: I0308 03:34:57.373994 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" Mar 08 03:34:57.389697 master-0 kubenswrapper[13046]: I0308 03:34:57.389639 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9"] Mar 08 03:34:57.400646 master-0 kubenswrapper[13046]: I0308 03:34:57.400335 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t"] Mar 08 03:34:57.405006 master-0 kubenswrapper[13046]: I0308 03:34:57.404829 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" Mar 08 03:34:57.415769 master-0 kubenswrapper[13046]: I0308 03:34:57.415737 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" Mar 08 03:34:57.437252 master-0 kubenswrapper[13046]: I0308 03:34:57.437223 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf"] Mar 08 03:34:57.438730 master-0 kubenswrapper[13046]: I0308 03:34:57.438691 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" Mar 08 03:34:57.467257 master-0 kubenswrapper[13046]: I0308 03:34:57.467221 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:57.467443 master-0 kubenswrapper[13046]: I0308 03:34:57.467425 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzf4q\" (UniqueName: \"kubernetes.io/projected/1efdb13d-44bc-429f-bc09-cb520504d91c-kube-api-access-mzf4q\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:57.467590 master-0 kubenswrapper[13046]: I0308 03:34:57.467567 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2csgj\" (UniqueName: \"kubernetes.io/projected/815174b8-3094-4bd1-bc8c-4b47adcfdcea-kube-api-access-2csgj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-mxcfj\" (UID: \"815174b8-3094-4bd1-bc8c-4b47adcfdcea\") " 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" Mar 08 03:34:57.468096 master-0 kubenswrapper[13046]: I0308 03:34:57.468070 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z88c8\" (UniqueName: \"kubernetes.io/projected/7573117d-5c36-4c09-b193-3bf1fbc4c487-kube-api-access-z88c8\") pod \"placement-operator-controller-manager-648564c9fc-qsh7t\" (UID: \"7573117d-5c36-4c09-b193-3bf1fbc4c487\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" Mar 08 03:34:57.468258 master-0 kubenswrapper[13046]: I0308 03:34:57.468239 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8pk\" (UniqueName: \"kubernetes.io/projected/ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69-kube-api-access-pl8pk\") pod \"ovn-operator-controller-manager-75684d597f-dbrs9\" (UID: \"ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" Mar 08 03:34:57.471373 master-0 kubenswrapper[13046]: I0308 03:34:57.471311 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t"] Mar 08 03:34:57.497236 master-0 kubenswrapper[13046]: I0308 03:34:57.497162 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf"] Mar 08 03:34:57.520961 master-0 kubenswrapper[13046]: I0308 03:34:57.520562 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv"] Mar 08 03:34:57.521708 master-0 kubenswrapper[13046]: I0308 03:34:57.521680 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" Mar 08 03:34:57.526508 master-0 kubenswrapper[13046]: I0308 03:34:57.525737 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2csgj\" (UniqueName: \"kubernetes.io/projected/815174b8-3094-4bd1-bc8c-4b47adcfdcea-kube-api-access-2csgj\") pod \"octavia-operator-controller-manager-5d86c7ddb7-mxcfj\" (UID: \"815174b8-3094-4bd1-bc8c-4b47adcfdcea\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" Mar 08 03:34:57.530662 master-0 kubenswrapper[13046]: I0308 03:34:57.530585 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9"] Mar 08 03:34:57.531787 master-0 kubenswrapper[13046]: I0308 03:34:57.531761 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" Mar 08 03:34:57.544159 master-0 kubenswrapper[13046]: I0308 03:34:57.540515 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv"] Mar 08 03:34:57.574827 master-0 kubenswrapper[13046]: I0308 03:34:57.574770 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z88c8\" (UniqueName: \"kubernetes.io/projected/7573117d-5c36-4c09-b193-3bf1fbc4c487-kube-api-access-z88c8\") pod \"placement-operator-controller-manager-648564c9fc-qsh7t\" (UID: \"7573117d-5c36-4c09-b193-3bf1fbc4c487\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" Mar 08 03:34:57.575008 master-0 kubenswrapper[13046]: I0308 03:34:57.574857 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8pk\" (UniqueName: \"kubernetes.io/projected/ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69-kube-api-access-pl8pk\") pod 
\"ovn-operator-controller-manager-75684d597f-dbrs9\" (UID: \"ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" Mar 08 03:34:57.575008 master-0 kubenswrapper[13046]: I0308 03:34:57.574926 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:57.575008 master-0 kubenswrapper[13046]: I0308 03:34:57.574985 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzf4q\" (UniqueName: \"kubernetes.io/projected/1efdb13d-44bc-429f-bc09-cb520504d91c-kube-api-access-mzf4q\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:57.575106 master-0 kubenswrapper[13046]: I0308 03:34:57.575053 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2km46\" (UniqueName: \"kubernetes.io/projected/e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63-kube-api-access-2km46\") pod \"swift-operator-controller-manager-9b9ff9f4d-ckkmf\" (UID: \"e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" Mar 08 03:34:57.579568 master-0 kubenswrapper[13046]: E0308 03:34:57.579215 13046 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:34:57.579568 master-0 kubenswrapper[13046]: E0308 03:34:57.579322 13046 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert podName:1efdb13d-44bc-429f-bc09-cb520504d91c nodeName:}" failed. No retries permitted until 2026-03-08 03:34:58.079292822 +0000 UTC m=+1300.158060039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" (UID: "1efdb13d-44bc-429f-bc09-cb520504d91c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:34:57.598706 master-0 kubenswrapper[13046]: I0308 03:34:57.598658 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z88c8\" (UniqueName: \"kubernetes.io/projected/7573117d-5c36-4c09-b193-3bf1fbc4c487-kube-api-access-z88c8\") pod \"placement-operator-controller-manager-648564c9fc-qsh7t\" (UID: \"7573117d-5c36-4c09-b193-3bf1fbc4c487\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" Mar 08 03:34:57.612458 master-0 kubenswrapper[13046]: I0308 03:34:57.601951 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9"] Mar 08 03:34:57.612458 master-0 kubenswrapper[13046]: I0308 03:34:57.606048 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9"] Mar 08 03:34:57.626641 master-0 kubenswrapper[13046]: I0308 03:34:57.625678 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzf4q\" (UniqueName: \"kubernetes.io/projected/1efdb13d-44bc-429f-bc09-cb520504d91c-kube-api-access-mzf4q\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:57.626641 master-0 
kubenswrapper[13046]: I0308 03:34:57.626272 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" Mar 08 03:34:57.632803 master-0 kubenswrapper[13046]: I0308 03:34:57.627346 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" Mar 08 03:34:57.632803 master-0 kubenswrapper[13046]: I0308 03:34:57.632756 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8pk\" (UniqueName: \"kubernetes.io/projected/ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69-kube-api-access-pl8pk\") pod \"ovn-operator-controller-manager-75684d597f-dbrs9\" (UID: \"ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" Mar 08 03:34:57.641721 master-0 kubenswrapper[13046]: I0308 03:34:57.640983 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9"] Mar 08 03:34:57.677520 master-0 kubenswrapper[13046]: I0308 03:34:57.677453 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hkzf\" (UniqueName: \"kubernetes.io/projected/8705b78b-c460-4314-9951-db06f96f49e3-kube-api-access-4hkzf\") pod \"telemetry-operator-controller-manager-5fdb694969-fcgrv\" (UID: \"8705b78b-c460-4314-9951-db06f96f49e3\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" Mar 08 03:34:57.677520 master-0 kubenswrapper[13046]: I0308 03:34:57.677508 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2km46\" (UniqueName: \"kubernetes.io/projected/e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63-kube-api-access-2km46\") pod \"swift-operator-controller-manager-9b9ff9f4d-ckkmf\" (UID: \"e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63\") " 
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" Mar 08 03:34:57.677704 master-0 kubenswrapper[13046]: I0308 03:34:57.677541 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdd25\" (UniqueName: \"kubernetes.io/projected/a25b753b-985a-4260-a297-02f3e4e86122-kube-api-access-wdd25\") pod \"test-operator-controller-manager-55b5ff4dbb-lt4h9\" (UID: \"a25b753b-985a-4260-a297-02f3e4e86122\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" Mar 08 03:34:57.679796 master-0 kubenswrapper[13046]: I0308 03:34:57.679759 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw"] Mar 08 03:34:57.681838 master-0 kubenswrapper[13046]: I0308 03:34:57.681206 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.689284 master-0 kubenswrapper[13046]: I0308 03:34:57.689258 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 08 03:34:57.691747 master-0 kubenswrapper[13046]: I0308 03:34:57.690978 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 08 03:34:57.694125 master-0 kubenswrapper[13046]: I0308 03:34:57.694096 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2km46\" (UniqueName: \"kubernetes.io/projected/e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63-kube-api-access-2km46\") pod \"swift-operator-controller-manager-9b9ff9f4d-ckkmf\" (UID: \"e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" Mar 08 03:34:57.694211 master-0 kubenswrapper[13046]: I0308 03:34:57.694152 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw"] Mar 08 03:34:57.708161 master-0 kubenswrapper[13046]: I0308 03:34:57.708095 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b"] Mar 08 03:34:57.714325 master-0 kubenswrapper[13046]: I0308 03:34:57.713053 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" Mar 08 03:34:57.734844 master-0 kubenswrapper[13046]: I0308 03:34:57.734759 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b"] Mar 08 03:34:57.735070 master-0 kubenswrapper[13046]: I0308 03:34:57.735044 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" Mar 08 03:34:57.740590 master-0 kubenswrapper[13046]: I0308 03:34:57.740471 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" Mar 08 03:34:57.778923 master-0 kubenswrapper[13046]: I0308 03:34:57.778861 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.779089 master-0 kubenswrapper[13046]: I0308 03:34:57.779011 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4z4w\" (UniqueName: \"kubernetes.io/projected/a85493ea-63bd-4f74-bee8-e93d882d0991-kube-api-access-l4z4w\") pod \"watcher-operator-controller-manager-bccc79885-lv2t9\" (UID: \"a85493ea-63bd-4f74-bee8-e93d882d0991\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" Mar 08 03:34:57.779089 master-0 kubenswrapper[13046]: I0308 03:34:57.779041 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6q5q\" (UniqueName: \"kubernetes.io/projected/f77296db-7b56-4306-bcf1-d2cef736f49f-kube-api-access-z6q5q\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.779089 master-0 kubenswrapper[13046]: I0308 03:34:57.779076 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " 
pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.779202 master-0 kubenswrapper[13046]: I0308 03:34:57.779102 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nnjn\" (UniqueName: \"kubernetes.io/projected/2cc8c661-4f55-495d-92bf-9075a5ecf8ab-kube-api-access-5nnjn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fg92b\" (UID: \"2cc8c661-4f55-495d-92bf-9075a5ecf8ab\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" Mar 08 03:34:57.779202 master-0 kubenswrapper[13046]: I0308 03:34:57.779155 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hkzf\" (UniqueName: \"kubernetes.io/projected/8705b78b-c460-4314-9951-db06f96f49e3-kube-api-access-4hkzf\") pod \"telemetry-operator-controller-manager-5fdb694969-fcgrv\" (UID: \"8705b78b-c460-4314-9951-db06f96f49e3\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" Mar 08 03:34:57.779202 master-0 kubenswrapper[13046]: I0308 03:34:57.779195 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdd25\" (UniqueName: \"kubernetes.io/projected/a25b753b-985a-4260-a297-02f3e4e86122-kube-api-access-wdd25\") pod \"test-operator-controller-manager-55b5ff4dbb-lt4h9\" (UID: \"a25b753b-985a-4260-a297-02f3e4e86122\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" Mar 08 03:34:57.779775 master-0 kubenswrapper[13046]: I0308 03:34:57.779741 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm"] Mar 08 03:34:57.787856 master-0 kubenswrapper[13046]: I0308 03:34:57.782439 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" Mar 08 03:34:57.796777 master-0 kubenswrapper[13046]: I0308 03:34:57.796610 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hkzf\" (UniqueName: \"kubernetes.io/projected/8705b78b-c460-4314-9951-db06f96f49e3-kube-api-access-4hkzf\") pod \"telemetry-operator-controller-manager-5fdb694969-fcgrv\" (UID: \"8705b78b-c460-4314-9951-db06f96f49e3\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" Mar 08 03:34:57.797673 master-0 kubenswrapper[13046]: I0308 03:34:57.797604 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdd25\" (UniqueName: \"kubernetes.io/projected/a25b753b-985a-4260-a297-02f3e4e86122-kube-api-access-wdd25\") pod \"test-operator-controller-manager-55b5ff4dbb-lt4h9\" (UID: \"a25b753b-985a-4260-a297-02f3e4e86122\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" Mar 08 03:34:57.835712 master-0 kubenswrapper[13046]: I0308 03:34:57.835651 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5"] Mar 08 03:34:57.883300 master-0 kubenswrapper[13046]: I0308 03:34:57.883241 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4z4w\" (UniqueName: \"kubernetes.io/projected/a85493ea-63bd-4f74-bee8-e93d882d0991-kube-api-access-l4z4w\") pod \"watcher-operator-controller-manager-bccc79885-lv2t9\" (UID: \"a85493ea-63bd-4f74-bee8-e93d882d0991\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" Mar 08 03:34:57.883474 master-0 kubenswrapper[13046]: I0308 03:34:57.883310 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6q5q\" (UniqueName: 
\"kubernetes.io/projected/f77296db-7b56-4306-bcf1-d2cef736f49f-kube-api-access-z6q5q\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.883474 master-0 kubenswrapper[13046]: I0308 03:34:57.883350 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nnjn\" (UniqueName: \"kubernetes.io/projected/2cc8c661-4f55-495d-92bf-9075a5ecf8ab-kube-api-access-5nnjn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fg92b\" (UID: \"2cc8c661-4f55-495d-92bf-9075a5ecf8ab\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" Mar 08 03:34:57.883474 master-0 kubenswrapper[13046]: I0308 03:34:57.883381 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.883598 master-0 kubenswrapper[13046]: I0308 03:34:57.883525 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.884230 master-0 kubenswrapper[13046]: E0308 03:34:57.884197 13046 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 03:34:57.884358 master-0 kubenswrapper[13046]: E0308 03:34:57.884348 13046 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:34:58.384322721 +0000 UTC m=+1300.463089938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "metrics-server-cert" not found Mar 08 03:34:57.885082 master-0 kubenswrapper[13046]: E0308 03:34:57.885066 13046 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 03:34:57.885178 master-0 kubenswrapper[13046]: E0308 03:34:57.885167 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:34:58.385152645 +0000 UTC m=+1300.463919862 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "webhook-server-cert" not found Mar 08 03:34:57.895094 master-0 kubenswrapper[13046]: I0308 03:34:57.895029 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj"] Mar 08 03:34:57.916264 master-0 kubenswrapper[13046]: I0308 03:34:57.913039 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6q5q\" (UniqueName: \"kubernetes.io/projected/f77296db-7b56-4306-bcf1-d2cef736f49f-kube-api-access-z6q5q\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:57.916264 master-0 kubenswrapper[13046]: I0308 03:34:57.913576 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nnjn\" (UniqueName: \"kubernetes.io/projected/2cc8c661-4f55-495d-92bf-9075a5ecf8ab-kube-api-access-5nnjn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fg92b\" (UID: \"2cc8c661-4f55-495d-92bf-9075a5ecf8ab\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" Mar 08 03:34:57.916264 master-0 kubenswrapper[13046]: I0308 03:34:57.916084 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4z4w\" (UniqueName: \"kubernetes.io/projected/a85493ea-63bd-4f74-bee8-e93d882d0991-kube-api-access-l4z4w\") pod \"watcher-operator-controller-manager-bccc79885-lv2t9\" (UID: \"a85493ea-63bd-4f74-bee8-e93d882d0991\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" Mar 08 03:34:57.978674 master-0 kubenswrapper[13046]: I0308 03:34:57.978583 13046 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" Mar 08 03:34:57.988323 master-0 kubenswrapper[13046]: I0308 03:34:57.988229 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" Mar 08 03:34:58.009261 master-0 kubenswrapper[13046]: I0308 03:34:58.004420 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" Mar 08 03:34:58.036202 master-0 kubenswrapper[13046]: I0308 03:34:58.035773 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" Mar 08 03:34:58.088686 master-0 kubenswrapper[13046]: I0308 03:34:58.088651 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:58.090156 master-0 kubenswrapper[13046]: E0308 03:34:58.088951 13046 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:34:58.090156 master-0 kubenswrapper[13046]: E0308 03:34:58.089025 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert podName:1efdb13d-44bc-429f-bc09-cb520504d91c nodeName:}" failed. No retries permitted until 2026-03-08 03:34:59.089003301 +0000 UTC m=+1301.167770518 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" (UID: "1efdb13d-44bc-429f-bc09-cb520504d91c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:34:58.206749 master-0 kubenswrapper[13046]: I0308 03:34:58.206692 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" event={"ID":"a346c90a-64e5-48d5-a428-77012a677ea6","Type":"ContainerStarted","Data":"691fe8a7abd8e5603c0dd31a2f620652b126c334d522dace1442dcb29b18d3a0"} Mar 08 03:34:58.207385 master-0 kubenswrapper[13046]: I0308 03:34:58.207360 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" event={"ID":"598125df-0908-4698-ac00-349bc90e6f9d","Type":"ContainerStarted","Data":"f25af752705dae7a3efabd677feeae3df5fe2511f588964d3ad2363d5521b954"} Mar 08 03:34:58.208537 master-0 kubenswrapper[13046]: I0308 03:34:58.208471 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" event={"ID":"494aa38f-eb0c-4f1a-9789-2c96aa18460c","Type":"ContainerStarted","Data":"e5993f39fb516f88456900d3a7e5a6a02b123a565394b56218152b8fc03920e2"} Mar 08 03:34:58.300015 master-0 kubenswrapper[13046]: I0308 03:34:58.299622 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs"] Mar 08 03:34:58.311430 master-0 kubenswrapper[13046]: W0308 03:34:58.311388 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f728375_b21a_4e8b_8c90_bd501c82d6b2.slice/crio-9c9bef041b16bf4eaf23b77a269d9ba3a9f39301d556f0db3cd2db37d9483c22 WatchSource:0}: Error finding container 
9c9bef041b16bf4eaf23b77a269d9ba3a9f39301d556f0db3cd2db37d9483c22: Status 404 returned error can't find the container with id 9c9bef041b16bf4eaf23b77a269d9ba3a9f39301d556f0db3cd2db37d9483c22 Mar 08 03:34:58.312105 master-0 kubenswrapper[13046]: I0308 03:34:58.312077 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9"] Mar 08 03:34:58.312992 master-0 kubenswrapper[13046]: W0308 03:34:58.312926 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfde9c4f2_1850_4139_9f22_73a12e2e66f8.slice/crio-4fb3f74ac4fc4cafa7acafcca386844d1075094d7b59cb46ab96beefaa514211 WatchSource:0}: Error finding container 4fb3f74ac4fc4cafa7acafcca386844d1075094d7b59cb46ab96beefaa514211: Status 404 returned error can't find the container with id 4fb3f74ac4fc4cafa7acafcca386844d1075094d7b59cb46ab96beefaa514211 Mar 08 03:34:58.322015 master-0 kubenswrapper[13046]: I0308 03:34:58.320009 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr"] Mar 08 03:34:58.334786 master-0 kubenswrapper[13046]: I0308 03:34:58.334395 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g"] Mar 08 03:34:58.348196 master-0 kubenswrapper[13046]: I0308 03:34:58.348091 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp"] Mar 08 03:34:58.354836 master-0 kubenswrapper[13046]: I0308 03:34:58.354768 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5"] Mar 08 03:34:58.395337 master-0 kubenswrapper[13046]: I0308 03:34:58.395272 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:34:58.395558 master-0 kubenswrapper[13046]: I0308 03:34:58.395347 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:58.395558 master-0 kubenswrapper[13046]: I0308 03:34:58.395521 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:58.395888 master-0 kubenswrapper[13046]: E0308 03:34:58.395826 13046 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 03:34:58.395938 master-0 kubenswrapper[13046]: E0308 03:34:58.395900 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert podName:17f3dac6-f1b5-4396-abe1-c2ef1e3a321e nodeName:}" failed. No retries permitted until 2026-03-08 03:35:00.395883453 +0000 UTC m=+1302.474650670 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert") pod "infra-operator-controller-manager-65b58d74b-99265" (UID: "17f3dac6-f1b5-4396-abe1-c2ef1e3a321e") : secret "infra-operator-webhook-server-cert" not found Mar 08 03:34:58.395938 master-0 kubenswrapper[13046]: E0308 03:34:58.395923 13046 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 03:34:58.396035 master-0 kubenswrapper[13046]: E0308 03:34:58.396011 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:34:59.395990996 +0000 UTC m=+1301.474758243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "webhook-server-cert" not found Mar 08 03:34:58.396079 master-0 kubenswrapper[13046]: E0308 03:34:58.396068 13046 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 03:34:58.397588 master-0 kubenswrapper[13046]: E0308 03:34:58.397560 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:34:59.39613578 +0000 UTC m=+1301.474902997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "metrics-server-cert" not found Mar 08 03:34:58.860528 master-0 kubenswrapper[13046]: W0308 03:34:58.860215 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode65fbcb8_1b79_4fd3_8e12_1d0dd4b33c63.slice/crio-33cd461e71143732b51a02063305de17542c61b33ab3ed595df816e87eea356c WatchSource:0}: Error finding container 33cd461e71143732b51a02063305de17542c61b33ab3ed595df816e87eea356c: Status 404 returned error can't find the container with id 33cd461e71143732b51a02063305de17542c61b33ab3ed595df816e87eea356c Mar 08 03:34:58.879949 master-0 kubenswrapper[13046]: W0308 03:34:58.879871 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc470d5a0_46d5_4a08_bd51_7feddc4beaef.slice/crio-2f29ceeb59dc2061894841c982252f165ecd106bafd8d2ef6dfa872e112b9cb7 WatchSource:0}: Error finding container 2f29ceeb59dc2061894841c982252f165ecd106bafd8d2ef6dfa872e112b9cb7: Status 404 returned error can't find the container with id 2f29ceeb59dc2061894841c982252f165ecd106bafd8d2ef6dfa872e112b9cb7 Mar 08 03:34:58.882306 master-0 kubenswrapper[13046]: I0308 03:34:58.882257 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9"] Mar 08 03:34:58.897855 master-0 kubenswrapper[13046]: I0308 03:34:58.897273 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf"] Mar 08 03:34:58.908769 master-0 kubenswrapper[13046]: I0308 03:34:58.908447 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h"] Mar 08 03:34:58.927820 master-0 kubenswrapper[13046]: I0308 03:34:58.927690 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t"] Mar 08 03:34:58.945583 master-0 kubenswrapper[13046]: I0308 03:34:58.945532 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj"] Mar 08 03:34:58.970583 master-0 kubenswrapper[13046]: I0308 03:34:58.969561 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl"] Mar 08 03:34:58.980583 master-0 kubenswrapper[13046]: I0308 03:34:58.979148 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6"] Mar 08 03:34:59.128916 master-0 kubenswrapper[13046]: I0308 03:34:59.128620 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:34:59.128916 master-0 kubenswrapper[13046]: E0308 03:34:59.128855 13046 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:34:59.128916 master-0 kubenswrapper[13046]: E0308 03:34:59.128901 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert podName:1efdb13d-44bc-429f-bc09-cb520504d91c nodeName:}" failed. 
No retries permitted until 2026-03-08 03:35:01.128887829 +0000 UTC m=+1303.207655036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" (UID: "1efdb13d-44bc-429f-bc09-cb520504d91c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:34:59.237080 master-0 kubenswrapper[13046]: I0308 03:34:59.232828 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9"] Mar 08 03:34:59.244544 master-0 kubenswrapper[13046]: I0308 03:34:59.244361 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" event={"ID":"e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63","Type":"ContainerStarted","Data":"33cd461e71143732b51a02063305de17542c61b33ab3ed595df816e87eea356c"} Mar 08 03:34:59.245886 master-0 kubenswrapper[13046]: W0308 03:34:59.245308 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cc8c661_4f55_495d_92bf_9075a5ecf8ab.slice/crio-d5addc29cf565abc4052c24e51981be9a56423a659a67b7950671fc3e93784da WatchSource:0}: Error finding container d5addc29cf565abc4052c24e51981be9a56423a659a67b7950671fc3e93784da: Status 404 returned error can't find the container with id d5addc29cf565abc4052c24e51981be9a56423a659a67b7950671fc3e93784da Mar 08 03:34:59.248250 master-0 kubenswrapper[13046]: I0308 03:34:59.247573 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" event={"ID":"ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69","Type":"ContainerStarted","Data":"4db05a0b8ffa38f376cc28771bf7ae6b78ee9c7ce84ce058f3b38bc1e0356c92"} Mar 08 03:34:59.249368 master-0 kubenswrapper[13046]: I0308 03:34:59.249245 13046 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" event={"ID":"815174b8-3094-4bd1-bc8c-4b47adcfdcea","Type":"ContainerStarted","Data":"9a278f6c132efef78c913fcddab40b3e0d515382dd61af256ec88d6fe03edaa2"} Mar 08 03:34:59.249704 master-0 kubenswrapper[13046]: W0308 03:34:59.249667 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8705b78b_c460_4314_9951_db06f96f49e3.slice/crio-e9255b12bf362ab993c04164de1c8c829670a48c6cd5d66bac3143c9dc001e65 WatchSource:0}: Error finding container e9255b12bf362ab993c04164de1c8c829670a48c6cd5d66bac3143c9dc001e65: Status 404 returned error can't find the container with id e9255b12bf362ab993c04164de1c8c829670a48c6cd5d66bac3143c9dc001e65 Mar 08 03:34:59.250525 master-0 kubenswrapper[13046]: W0308 03:34:59.250470 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda25b753b_985a_4260_a297_02f3e4e86122.slice/crio-e4f45815871fad4af1bb44f42a54f783b3ede8f9b4afae397f4267299d0c9ed5 WatchSource:0}: Error finding container e4f45815871fad4af1bb44f42a54f783b3ede8f9b4afae397f4267299d0c9ed5: Status 404 returned error can't find the container with id e4f45815871fad4af1bb44f42a54f783b3ede8f9b4afae397f4267299d0c9ed5 Mar 08 03:34:59.252708 master-0 kubenswrapper[13046]: I0308 03:34:59.251996 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" event={"ID":"389e5461-3240-444e-8230-621192f5bc87","Type":"ContainerStarted","Data":"595359ff63f2dd63a17cc8ab330ea76c56c62e2a0d2f4dc938a2415bb1b24cc1"} Mar 08 03:34:59.255419 master-0 kubenswrapper[13046]: E0308 03:34:59.254696 13046 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4hkzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-fcgrv_openstack-operators(8705b78b-c460-4314-9951-db06f96f49e3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 03:34:59.255419 master-0 kubenswrapper[13046]: I0308 03:34:59.255315 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" event={"ID":"9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3","Type":"ContainerStarted","Data":"9ee7e5c0846423d672161c91d1ebc467aa5cfb90a731208f42a10f246223e66c"} Mar 08 03:34:59.256244 master-0 kubenswrapper[13046]: E0308 03:34:59.256165 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" podUID="8705b78b-c460-4314-9951-db06f96f49e3" Mar 08 03:34:59.258474 master-0 kubenswrapper[13046]: I0308 03:34:59.258440 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" 
event={"ID":"22e4bfb2-d3d5-420e-b79e-d36ead75b302","Type":"ContainerStarted","Data":"d6263958d47a42ad832230e0790a3645446568d681fe133fb3b828bfb0b0f016"} Mar 08 03:34:59.270021 master-0 kubenswrapper[13046]: E0308 03:34:59.269927 13046 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdd25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-lt4h9_openstack-operators(a25b753b-985a-4260-a297-02f3e4e86122): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 03:34:59.271049 master-0 kubenswrapper[13046]: E0308 03:34:59.271013 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" podUID="a25b753b-985a-4260-a297-02f3e4e86122" Mar 08 03:34:59.276538 master-0 kubenswrapper[13046]: I0308 03:34:59.274846 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" event={"ID":"42aeff7c-1920-43a9-aaaf-c8f9f7d54752","Type":"ContainerStarted","Data":"cdb474c8b82b407c61b004d6371841c551a78cdb6ebafc7ac9bd456ac3bd9474"} Mar 08 03:34:59.282355 master-0 kubenswrapper[13046]: I0308 03:34:59.280888 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" 
event={"ID":"5adb5035-1d71-4fc4-8174-b47b3f367be2","Type":"ContainerStarted","Data":"a311c92bc10916eef2fec0c14a255bc46a3ce070bfa5394840fd34f357691559"} Mar 08 03:34:59.287515 master-0 kubenswrapper[13046]: I0308 03:34:59.286582 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b"] Mar 08 03:34:59.289965 master-0 kubenswrapper[13046]: E0308 03:34:59.281007 13046 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4z4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-lv2t9_openstack-operators(a85493ea-63bd-4f74-bee8-e93d882d0991): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 03:34:59.293517 master-0 kubenswrapper[13046]: I0308 03:34:59.291745 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" event={"ID":"9a9e2b01-2399-4ea4-9eea-52d8e7050649","Type":"ContainerStarted","Data":"f457cb2c98aab8d895fd0ea39ed32ce2854effd3c3b0b2c4e2444e75a01626c9"} Mar 08 03:34:59.293517 master-0 kubenswrapper[13046]: E0308 03:34:59.291817 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" podUID="a85493ea-63bd-4f74-bee8-e93d882d0991" Mar 08 03:34:59.298559 master-0 kubenswrapper[13046]: I0308 03:34:59.298207 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv"] Mar 08 03:34:59.308503 master-0 kubenswrapper[13046]: I0308 03:34:59.306308 13046 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" event={"ID":"7573117d-5c36-4c09-b193-3bf1fbc4c487","Type":"ContainerStarted","Data":"ec8e4506675c10cf913ceffd89a0a9777f80ae5d6bb798c757a4ac1bca391834"} Mar 08 03:34:59.313061 master-0 kubenswrapper[13046]: I0308 03:34:59.310386 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9"] Mar 08 03:34:59.326279 master-0 kubenswrapper[13046]: I0308 03:34:59.326202 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" event={"ID":"c470d5a0-46d5-4a08-bd51-7feddc4beaef","Type":"ContainerStarted","Data":"2f29ceeb59dc2061894841c982252f165ecd106bafd8d2ef6dfa872e112b9cb7"} Mar 08 03:34:59.347452 master-0 kubenswrapper[13046]: I0308 03:34:59.347339 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" event={"ID":"fde9c4f2-1850-4139-9f22-73a12e2e66f8","Type":"ContainerStarted","Data":"4fb3f74ac4fc4cafa7acafcca386844d1075094d7b59cb46ab96beefaa514211"} Mar 08 03:34:59.351977 master-0 kubenswrapper[13046]: I0308 03:34:59.351946 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" event={"ID":"5f728375-b21a-4e8b-8c90-bd501c82d6b2","Type":"ContainerStarted","Data":"9c9bef041b16bf4eaf23b77a269d9ba3a9f39301d556f0db3cd2db37d9483c22"} Mar 08 03:34:59.439337 master-0 kubenswrapper[13046]: I0308 03:34:59.439271 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " 
pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:59.439675 master-0 kubenswrapper[13046]: I0308 03:34:59.439403 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:34:59.439675 master-0 kubenswrapper[13046]: E0308 03:34:59.439570 13046 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 03:34:59.439675 master-0 kubenswrapper[13046]: E0308 03:34:59.439618 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:35:01.439603688 +0000 UTC m=+1303.518370905 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "metrics-server-cert" not found Mar 08 03:34:59.439939 master-0 kubenswrapper[13046]: E0308 03:34:59.439759 13046 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 03:34:59.439939 master-0 kubenswrapper[13046]: E0308 03:34:59.439823 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. 
No retries permitted until 2026-03-08 03:35:01.439807093 +0000 UTC m=+1303.518574300 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "webhook-server-cert" not found Mar 08 03:35:00.363637 master-0 kubenswrapper[13046]: I0308 03:35:00.363540 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" event={"ID":"a25b753b-985a-4260-a297-02f3e4e86122","Type":"ContainerStarted","Data":"e4f45815871fad4af1bb44f42a54f783b3ede8f9b4afae397f4267299d0c9ed5"} Mar 08 03:35:00.366357 master-0 kubenswrapper[13046]: E0308 03:35:00.364927 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" podUID="a25b753b-985a-4260-a297-02f3e4e86122" Mar 08 03:35:00.366357 master-0 kubenswrapper[13046]: I0308 03:35:00.365534 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" event={"ID":"2cc8c661-4f55-495d-92bf-9075a5ecf8ab","Type":"ContainerStarted","Data":"d5addc29cf565abc4052c24e51981be9a56423a659a67b7950671fc3e93784da"} Mar 08 03:35:00.376173 master-0 kubenswrapper[13046]: I0308 03:35:00.376122 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" event={"ID":"a85493ea-63bd-4f74-bee8-e93d882d0991","Type":"ContainerStarted","Data":"3da3796a3ff579aed96ec7d898bf21c7d832922f348500e9c29b708e707ff208"} Mar 08 03:35:00.377599 
master-0 kubenswrapper[13046]: E0308 03:35:00.377242 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" podUID="a85493ea-63bd-4f74-bee8-e93d882d0991" Mar 08 03:35:00.384627 master-0 kubenswrapper[13046]: I0308 03:35:00.381419 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" event={"ID":"8705b78b-c460-4314-9951-db06f96f49e3","Type":"ContainerStarted","Data":"e9255b12bf362ab993c04164de1c8c829670a48c6cd5d66bac3143c9dc001e65"} Mar 08 03:35:00.384627 master-0 kubenswrapper[13046]: E0308 03:35:00.383836 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" podUID="8705b78b-c460-4314-9951-db06f96f49e3" Mar 08 03:35:00.466448 master-0 kubenswrapper[13046]: I0308 03:35:00.466385 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:35:00.466724 master-0 kubenswrapper[13046]: E0308 03:35:00.466598 13046 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 
03:35:00.466724 master-0 kubenswrapper[13046]: E0308 03:35:00.466678 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert podName:17f3dac6-f1b5-4396-abe1-c2ef1e3a321e nodeName:}" failed. No retries permitted until 2026-03-08 03:35:04.466659682 +0000 UTC m=+1306.545426899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert") pod "infra-operator-controller-manager-65b58d74b-99265" (UID: "17f3dac6-f1b5-4396-abe1-c2ef1e3a321e") : secret "infra-operator-webhook-server-cert" not found Mar 08 03:35:01.180043 master-0 kubenswrapper[13046]: I0308 03:35:01.179976 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:35:01.180514 master-0 kubenswrapper[13046]: E0308 03:35:01.180445 13046 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:35:01.180664 master-0 kubenswrapper[13046]: E0308 03:35:01.180609 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert podName:1efdb13d-44bc-429f-bc09-cb520504d91c nodeName:}" failed. No retries permitted until 2026-03-08 03:35:05.180582308 +0000 UTC m=+1307.259349555 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" (UID: "1efdb13d-44bc-429f-bc09-cb520504d91c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:35:01.391247 master-0 kubenswrapper[13046]: E0308 03:35:01.391105 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" podUID="a85493ea-63bd-4f74-bee8-e93d882d0991" Mar 08 03:35:01.391693 master-0 kubenswrapper[13046]: E0308 03:35:01.391632 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" podUID="a25b753b-985a-4260-a297-02f3e4e86122" Mar 08 03:35:01.392130 master-0 kubenswrapper[13046]: E0308 03:35:01.392096 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" podUID="8705b78b-c460-4314-9951-db06f96f49e3" Mar 08 03:35:01.486221 master-0 kubenswrapper[13046]: I0308 03:35:01.486113 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:01.486368 master-0 kubenswrapper[13046]: E0308 03:35:01.486286 13046 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 03:35:01.486496 master-0 kubenswrapper[13046]: E0308 03:35:01.486371 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:35:05.486352298 +0000 UTC m=+1307.565119515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "metrics-server-cert" not found Mar 08 03:35:01.486638 master-0 kubenswrapper[13046]: I0308 03:35:01.486580 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:01.486700 master-0 kubenswrapper[13046]: E0308 03:35:01.486680 13046 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 03:35:01.486753 master-0 kubenswrapper[13046]: E0308 03:35:01.486718 13046 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:35:05.486706498 +0000 UTC m=+1307.565473815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "webhook-server-cert" not found Mar 08 03:35:04.565513 master-0 kubenswrapper[13046]: I0308 03:35:04.565437 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:35:04.566669 master-0 kubenswrapper[13046]: E0308 03:35:04.565954 13046 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 03:35:04.566669 master-0 kubenswrapper[13046]: E0308 03:35:04.566030 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert podName:17f3dac6-f1b5-4396-abe1-c2ef1e3a321e nodeName:}" failed. No retries permitted until 2026-03-08 03:35:12.566004348 +0000 UTC m=+1314.644771605 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert") pod "infra-operator-controller-manager-65b58d74b-99265" (UID: "17f3dac6-f1b5-4396-abe1-c2ef1e3a321e") : secret "infra-operator-webhook-server-cert" not found Mar 08 03:35:05.181247 master-0 kubenswrapper[13046]: I0308 03:35:05.181183 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:35:05.181458 master-0 kubenswrapper[13046]: E0308 03:35:05.181341 13046 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:35:05.181458 master-0 kubenswrapper[13046]: E0308 03:35:05.181430 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert podName:1efdb13d-44bc-429f-bc09-cb520504d91c nodeName:}" failed. No retries permitted until 2026-03-08 03:35:13.181402067 +0000 UTC m=+1315.260169314 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" (UID: "1efdb13d-44bc-429f-bc09-cb520504d91c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 03:35:05.587049 master-0 kubenswrapper[13046]: I0308 03:35:05.586926 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:05.587049 master-0 kubenswrapper[13046]: I0308 03:35:05.587047 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:05.587646 master-0 kubenswrapper[13046]: E0308 03:35:05.587207 13046 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 03:35:05.587646 master-0 kubenswrapper[13046]: E0308 03:35:05.587210 13046 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 03:35:05.587646 master-0 kubenswrapper[13046]: E0308 03:35:05.587305 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. 
No retries permitted until 2026-03-08 03:35:13.587281859 +0000 UTC m=+1315.666049086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "webhook-server-cert" not found Mar 08 03:35:05.587646 master-0 kubenswrapper[13046]: E0308 03:35:05.587419 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:35:13.587389872 +0000 UTC m=+1315.666157169 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "metrics-server-cert" not found Mar 08 03:35:12.638643 master-0 kubenswrapper[13046]: I0308 03:35:12.638558 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:35:12.642863 master-0 kubenswrapper[13046]: I0308 03:35:12.642809 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17f3dac6-f1b5-4396-abe1-c2ef1e3a321e-cert\") pod \"infra-operator-controller-manager-65b58d74b-99265\" (UID: \"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:35:12.691191 master-0 kubenswrapper[13046]: 
I0308 03:35:12.691129 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:35:13.250555 master-0 kubenswrapper[13046]: I0308 03:35:13.250416 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:35:13.256005 master-0 kubenswrapper[13046]: I0308 03:35:13.255930 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1efdb13d-44bc-429f-bc09-cb520504d91c-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6\" (UID: \"1efdb13d-44bc-429f-bc09-cb520504d91c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:35:13.293965 master-0 kubenswrapper[13046]: I0308 03:35:13.293917 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:35:13.658425 master-0 kubenswrapper[13046]: I0308 03:35:13.658322 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:13.659377 master-0 kubenswrapper[13046]: I0308 03:35:13.658561 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:13.659377 master-0 kubenswrapper[13046]: E0308 03:35:13.658742 13046 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 03:35:13.659377 master-0 kubenswrapper[13046]: E0308 03:35:13.658814 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs podName:f77296db-7b56-4306-bcf1-d2cef736f49f nodeName:}" failed. No retries permitted until 2026-03-08 03:35:29.658793972 +0000 UTC m=+1331.737561209 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-m2zkw" (UID: "f77296db-7b56-4306-bcf1-d2cef736f49f") : secret "webhook-server-cert" not found Mar 08 03:35:13.662164 master-0 kubenswrapper[13046]: I0308 03:35:13.662138 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:18.075366 master-0 kubenswrapper[13046]: I0308 03:35:18.075242 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6"] Mar 08 03:35:18.619072 master-0 kubenswrapper[13046]: I0308 03:35:18.618996 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" event={"ID":"1efdb13d-44bc-429f-bc09-cb520504d91c","Type":"ContainerStarted","Data":"181d06eed0d01ee5daa4f87774d81799845e53346419c984954a9fd1fae159f1"} Mar 08 03:35:19.396211 master-0 kubenswrapper[13046]: I0308 03:35:19.396160 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65b58d74b-99265"] Mar 08 03:35:19.615331 master-0 kubenswrapper[13046]: W0308 03:35:19.615290 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f3dac6_f1b5_4396_abe1_c2ef1e3a321e.slice/crio-c6a3c2565398dbcb0b05410e8a511b6997aa7ae3b2f53d0cbd712190c2583182 WatchSource:0}: Error finding container c6a3c2565398dbcb0b05410e8a511b6997aa7ae3b2f53d0cbd712190c2583182: Status 404 returned 
error can't find the container with id c6a3c2565398dbcb0b05410e8a511b6997aa7ae3b2f53d0cbd712190c2583182 Mar 08 03:35:19.630232 master-0 kubenswrapper[13046]: I0308 03:35:19.630177 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" event={"ID":"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e","Type":"ContainerStarted","Data":"c6a3c2565398dbcb0b05410e8a511b6997aa7ae3b2f53d0cbd712190c2583182"} Mar 08 03:35:20.655805 master-0 kubenswrapper[13046]: I0308 03:35:20.655760 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" event={"ID":"a346c90a-64e5-48d5-a428-77012a677ea6","Type":"ContainerStarted","Data":"27f1888b10937cb545795daf28e21f274fd73f604712e39e0079c865275c24aa"} Mar 08 03:35:20.656155 master-0 kubenswrapper[13046]: I0308 03:35:20.655876 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" Mar 08 03:35:20.661543 master-0 kubenswrapper[13046]: I0308 03:35:20.661464 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" event={"ID":"ba69bf67-c6b4-4ce3-9bb5-dc8b5b7bfc69","Type":"ContainerStarted","Data":"3148b54a358ffc41c650a2ad9dddfe6b0a8321f07d6d93421831369017c23fa2"} Mar 08 03:35:20.662273 master-0 kubenswrapper[13046]: I0308 03:35:20.662236 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" Mar 08 03:35:20.680334 master-0 kubenswrapper[13046]: I0308 03:35:20.678784 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" event={"ID":"815174b8-3094-4bd1-bc8c-4b47adcfdcea","Type":"ContainerStarted","Data":"926cb1f0c432c0a5ea3ec123976c8bb5b09dda32df3e223db1c3e6008194e35e"} 
Mar 08 03:35:20.681171 master-0 kubenswrapper[13046]: I0308 03:35:20.680822 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" Mar 08 03:35:20.704584 master-0 kubenswrapper[13046]: I0308 03:35:20.702449 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" event={"ID":"2cc8c661-4f55-495d-92bf-9075a5ecf8ab","Type":"ContainerStarted","Data":"3544234bc2e26a15c3c8df2154269b1feea9dcd73b900dec7cbd4ab18061d4f0"} Mar 08 03:35:20.710686 master-0 kubenswrapper[13046]: I0308 03:35:20.709407 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" podStartSLOduration=5.761205097 podStartE2EDuration="24.709384599s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:57.601082078 +0000 UTC m=+1299.679849285" lastFinishedPulling="2026-03-08 03:35:16.54926157 +0000 UTC m=+1318.628028787" observedRunningTime="2026-03-08 03:35:20.692778948 +0000 UTC m=+1322.771546155" watchObservedRunningTime="2026-03-08 03:35:20.709384599 +0000 UTC m=+1322.788151826" Mar 08 03:35:20.729184 master-0 kubenswrapper[13046]: I0308 03:35:20.729074 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" event={"ID":"9f46b79c-0d18-49ec-ad91-d3cfd67ff3a3","Type":"ContainerStarted","Data":"93629e7e68f69f84465f2e06b90db55526fa5abb9de4d5c4abe467cc8f0308e3"} Mar 08 03:35:20.733046 master-0 kubenswrapper[13046]: I0308 03:35:20.731788 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" Mar 08 03:35:20.733308 master-0 kubenswrapper[13046]: I0308 03:35:20.733209 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" event={"ID":"42aeff7c-1920-43a9-aaaf-c8f9f7d54752","Type":"ContainerStarted","Data":"a9eb3482b37ad344bee66affa4109df14fdcbd27dfbed27de313c9b35908a164"} Mar 08 03:35:20.735167 master-0 kubenswrapper[13046]: I0308 03:35:20.735029 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" Mar 08 03:35:20.756304 master-0 kubenswrapper[13046]: I0308 03:35:20.756240 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" event={"ID":"fde9c4f2-1850-4139-9f22-73a12e2e66f8","Type":"ContainerStarted","Data":"e61b23479858a0ca5bd37203e24e3732daae4ca49118ee83518c61b894776d3a"} Mar 08 03:35:20.757197 master-0 kubenswrapper[13046]: I0308 03:35:20.757060 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" Mar 08 03:35:20.758360 master-0 kubenswrapper[13046]: I0308 03:35:20.758284 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" podStartSLOduration=7.056894863 podStartE2EDuration="24.758268294s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.871179448 +0000 UTC m=+1300.949946665" lastFinishedPulling="2026-03-08 03:35:16.572552879 +0000 UTC m=+1318.651320096" observedRunningTime="2026-03-08 03:35:20.757276746 +0000 UTC m=+1322.836043963" watchObservedRunningTime="2026-03-08 03:35:20.758268294 +0000 UTC m=+1322.837035511" Mar 08 03:35:20.784529 master-0 kubenswrapper[13046]: I0308 03:35:20.774693 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" 
event={"ID":"e65fbcb8-1b79-4fd3-8e12-1d0dd4b33c63","Type":"ContainerStarted","Data":"b17891f7e068f1580fe98640a45143062cc47892ee420480240f83161863c3a9"} Mar 08 03:35:20.784529 master-0 kubenswrapper[13046]: I0308 03:35:20.775735 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" Mar 08 03:35:20.786539 master-0 kubenswrapper[13046]: I0308 03:35:20.785591 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" event={"ID":"5adb5035-1d71-4fc4-8174-b47b3f367be2","Type":"ContainerStarted","Data":"e1ed3e75c492dbadaa3ef9cfc79b07cfb9a5d6b96a840ceb954c83ac36038319"} Mar 08 03:35:20.786539 master-0 kubenswrapper[13046]: I0308 03:35:20.786373 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" Mar 08 03:35:20.798206 master-0 kubenswrapper[13046]: I0308 03:35:20.798150 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" event={"ID":"9a9e2b01-2399-4ea4-9eea-52d8e7050649","Type":"ContainerStarted","Data":"55a55662b27eea26d25e13b6c154b910623689ab83575617c770ec9e2fcafdce"} Mar 08 03:35:20.798900 master-0 kubenswrapper[13046]: I0308 03:35:20.798878 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" Mar 08 03:35:20.809512 master-0 kubenswrapper[13046]: I0308 03:35:20.800824 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" podStartSLOduration=4.719232256 podStartE2EDuration="24.80080202s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.860316161 +0000 UTC m=+1300.939083378" lastFinishedPulling="2026-03-08 
03:35:18.941885915 +0000 UTC m=+1321.020653142" observedRunningTime="2026-03-08 03:35:20.796387455 +0000 UTC m=+1322.875154672" watchObservedRunningTime="2026-03-08 03:35:20.80080202 +0000 UTC m=+1322.879569237" Mar 08 03:35:20.815128 master-0 kubenswrapper[13046]: I0308 03:35:20.810945 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" event={"ID":"389e5461-3240-444e-8230-621192f5bc87","Type":"ContainerStarted","Data":"eb38daacb3cbe55fa1d2a440bb0553260fcc77d2427ca5828558311a81b71625"} Mar 08 03:35:20.815128 master-0 kubenswrapper[13046]: I0308 03:35:20.811539 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" Mar 08 03:35:20.839577 master-0 kubenswrapper[13046]: I0308 03:35:20.839499 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" podStartSLOduration=4.200914026 podStartE2EDuration="24.839458615s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.96492519 +0000 UTC m=+1301.043692417" lastFinishedPulling="2026-03-08 03:35:19.603469789 +0000 UTC m=+1321.682237006" observedRunningTime="2026-03-08 03:35:20.82801498 +0000 UTC m=+1322.906782197" watchObservedRunningTime="2026-03-08 03:35:20.839458615 +0000 UTC m=+1322.918225832" Mar 08 03:35:20.868282 master-0 kubenswrapper[13046]: I0308 03:35:20.864956 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" podStartSLOduration=7.162276229 podStartE2EDuration="24.864939297s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.868876333 +0000 UTC m=+1300.947643550" lastFinishedPulling="2026-03-08 03:35:16.571539401 +0000 UTC m=+1318.650306618" 
observedRunningTime="2026-03-08 03:35:20.863318421 +0000 UTC m=+1322.942085638" watchObservedRunningTime="2026-03-08 03:35:20.864939297 +0000 UTC m=+1322.943706514" Mar 08 03:35:20.951505 master-0 kubenswrapper[13046]: I0308 03:35:20.950939 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" podStartSLOduration=7.279567453 podStartE2EDuration="24.950922334s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.901726463 +0000 UTC m=+1300.980493680" lastFinishedPulling="2026-03-08 03:35:16.573081344 +0000 UTC m=+1318.651848561" observedRunningTime="2026-03-08 03:35:20.913758961 +0000 UTC m=+1322.992526178" watchObservedRunningTime="2026-03-08 03:35:20.950922334 +0000 UTC m=+1323.029689551" Mar 08 03:35:20.951505 master-0 kubenswrapper[13046]: I0308 03:35:20.951227 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" podStartSLOduration=5.635276615 podStartE2EDuration="24.951221213s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.294979408 +0000 UTC m=+1300.373746625" lastFinishedPulling="2026-03-08 03:35:17.610923966 +0000 UTC m=+1319.689691223" observedRunningTime="2026-03-08 03:35:20.945067319 +0000 UTC m=+1323.023834536" watchObservedRunningTime="2026-03-08 03:35:20.951221213 +0000 UTC m=+1323.029988430" Mar 08 03:35:21.101506 master-0 kubenswrapper[13046]: I0308 03:35:21.099053 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" podStartSLOduration=6.849131504 podStartE2EDuration="25.099033763s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.322245129 +0000 UTC m=+1300.401012346" lastFinishedPulling="2026-03-08 03:35:16.572147398 +0000 
UTC m=+1318.650914605" observedRunningTime="2026-03-08 03:35:21.082403362 +0000 UTC m=+1323.161170579" watchObservedRunningTime="2026-03-08 03:35:21.099033763 +0000 UTC m=+1323.177800980" Mar 08 03:35:21.113500 master-0 kubenswrapper[13046]: I0308 03:35:21.111163 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fg92b" podStartSLOduration=3.721667541 podStartE2EDuration="24.111141606s" podCreationTimestamp="2026-03-08 03:34:57 +0000 UTC" firstStartedPulling="2026-03-08 03:34:59.247167185 +0000 UTC m=+1301.325934402" lastFinishedPulling="2026-03-08 03:35:19.63664125 +0000 UTC m=+1321.715408467" observedRunningTime="2026-03-08 03:35:21.014228259 +0000 UTC m=+1323.092995476" watchObservedRunningTime="2026-03-08 03:35:21.111141606 +0000 UTC m=+1323.189908823" Mar 08 03:35:21.144503 master-0 kubenswrapper[13046]: I0308 03:35:21.144326 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" podStartSLOduration=6.879062364 podStartE2EDuration="25.144280626s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.307178993 +0000 UTC m=+1300.385946210" lastFinishedPulling="2026-03-08 03:35:16.572397255 +0000 UTC m=+1318.651164472" observedRunningTime="2026-03-08 03:35:21.136125684 +0000 UTC m=+1323.214892901" watchObservedRunningTime="2026-03-08 03:35:21.144280626 +0000 UTC m=+1323.223047833" Mar 08 03:35:21.823003 master-0 kubenswrapper[13046]: I0308 03:35:21.822952 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" event={"ID":"a85493ea-63bd-4f74-bee8-e93d882d0991","Type":"ContainerStarted","Data":"e0fc28c23f546bc7a723cc265f448fd6e27622e2e1f7a3bd3ef73ab6af29fa7c"} Mar 08 03:35:21.824026 master-0 kubenswrapper[13046]: I0308 03:35:21.824003 13046 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" Mar 08 03:35:21.832503 master-0 kubenswrapper[13046]: I0308 03:35:21.828446 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" event={"ID":"c470d5a0-46d5-4a08-bd51-7feddc4beaef","Type":"ContainerStarted","Data":"4b79d676c0d4cedc0721d3089b79bb2bff3737c18e09744d7fb61d4b97c4c566"} Mar 08 03:35:21.832503 master-0 kubenswrapper[13046]: I0308 03:35:21.828928 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" Mar 08 03:35:21.832503 master-0 kubenswrapper[13046]: I0308 03:35:21.830262 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" event={"ID":"8705b78b-c460-4314-9951-db06f96f49e3","Type":"ContainerStarted","Data":"ffa33b7aeb24795464fe15154bc0555bbca16e4ba7db660f2bff596f301b663c"} Mar 08 03:35:21.832503 master-0 kubenswrapper[13046]: I0308 03:35:21.830800 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" Mar 08 03:35:21.832503 master-0 kubenswrapper[13046]: I0308 03:35:21.831988 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" event={"ID":"22e4bfb2-d3d5-420e-b79e-d36ead75b302","Type":"ContainerStarted","Data":"6c366b0d5f769a7186d2541cf04189655d1ceb86a18bb3c1143539b5e3801dda"} Mar 08 03:35:21.832503 master-0 kubenswrapper[13046]: I0308 03:35:21.832368 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" Mar 08 03:35:21.839503 master-0 kubenswrapper[13046]: I0308 03:35:21.834300 13046 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" event={"ID":"5f728375-b21a-4e8b-8c90-bd501c82d6b2","Type":"ContainerStarted","Data":"f34260c4056edb04c8efe0d5445beff17b6b0f1f7b5564394714f40e1d090766"} Mar 08 03:35:21.839503 master-0 kubenswrapper[13046]: I0308 03:35:21.834752 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" Mar 08 03:35:21.839503 master-0 kubenswrapper[13046]: I0308 03:35:21.835973 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" event={"ID":"a25b753b-985a-4260-a297-02f3e4e86122","Type":"ContainerStarted","Data":"551fe5a36979d998d350ecb9d81d862bc886d8cee9719e797511bad1fe7b933b"} Mar 08 03:35:21.839503 master-0 kubenswrapper[13046]: I0308 03:35:21.836327 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" Mar 08 03:35:21.849505 master-0 kubenswrapper[13046]: I0308 03:35:21.844416 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" podStartSLOduration=5.185175539 podStartE2EDuration="25.844398272s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:59.280860458 +0000 UTC m=+1301.359627675" lastFinishedPulling="2026-03-08 03:35:19.940083181 +0000 UTC m=+1322.018850408" observedRunningTime="2026-03-08 03:35:21.844259208 +0000 UTC m=+1323.923026425" watchObservedRunningTime="2026-03-08 03:35:21.844398272 +0000 UTC m=+1323.923165489" Mar 08 03:35:21.864504 master-0 kubenswrapper[13046]: I0308 03:35:21.859176 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" 
event={"ID":"598125df-0908-4698-ac00-349bc90e6f9d","Type":"ContainerStarted","Data":"9c3ef9931cac012d938600e3ba92b766ecda1ca9dee7482ce812dfa43604a2de"} Mar 08 03:35:21.864504 master-0 kubenswrapper[13046]: I0308 03:35:21.859438 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" Mar 08 03:35:21.864504 master-0 kubenswrapper[13046]: I0308 03:35:21.862623 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" podStartSLOduration=7.597607083 podStartE2EDuration="25.862607348s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.307204544 +0000 UTC m=+1300.385971751" lastFinishedPulling="2026-03-08 03:35:16.572204799 +0000 UTC m=+1318.650972016" observedRunningTime="2026-03-08 03:35:21.218192841 +0000 UTC m=+1323.296960058" watchObservedRunningTime="2026-03-08 03:35:21.862607348 +0000 UTC m=+1323.941374565" Mar 08 03:35:21.875494 master-0 kubenswrapper[13046]: I0308 03:35:21.868901 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" event={"ID":"7573117d-5c36-4c09-b193-3bf1fbc4c487","Type":"ContainerStarted","Data":"86555e1a7b130f64f4ddc968aeaf6f3360f708e10b6009b86930f06f07e47d39"} Mar 08 03:35:21.875494 master-0 kubenswrapper[13046]: I0308 03:35:21.869657 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" Mar 08 03:35:21.886507 master-0 kubenswrapper[13046]: I0308 03:35:21.883617 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" podStartSLOduration=5.25983322 podStartE2EDuration="25.883600833s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" 
firstStartedPulling="2026-03-08 03:34:58.319303516 +0000 UTC m=+1300.398070733" lastFinishedPulling="2026-03-08 03:35:18.943071129 +0000 UTC m=+1321.021838346" observedRunningTime="2026-03-08 03:35:21.875876594 +0000 UTC m=+1323.954643811" watchObservedRunningTime="2026-03-08 03:35:21.883600833 +0000 UTC m=+1323.962368050" Mar 08 03:35:21.886507 master-0 kubenswrapper[13046]: I0308 03:35:21.886250 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" event={"ID":"494aa38f-eb0c-4f1a-9789-2c96aa18460c","Type":"ContainerStarted","Data":"e8ed298259df7c8d69795cfaeb71770f907392d67c6c7761d923e2dd9b5bf192"} Mar 08 03:35:21.886507 master-0 kubenswrapper[13046]: I0308 03:35:21.886287 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" Mar 08 03:35:21.915500 master-0 kubenswrapper[13046]: I0308 03:35:21.915067 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" podStartSLOduration=5.476027144 podStartE2EDuration="25.915043055s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:59.269780294 +0000 UTC m=+1301.348547511" lastFinishedPulling="2026-03-08 03:35:19.708796205 +0000 UTC m=+1321.787563422" observedRunningTime="2026-03-08 03:35:21.908574131 +0000 UTC m=+1323.987341348" watchObservedRunningTime="2026-03-08 03:35:21.915043055 +0000 UTC m=+1323.993810272" Mar 08 03:35:21.946500 master-0 kubenswrapper[13046]: I0308 03:35:21.945649 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" podStartSLOduration=5.568500337 podStartE2EDuration="25.935192266s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:59.254209754 +0000 UTC 
m=+1301.332976971" lastFinishedPulling="2026-03-08 03:35:19.620901683 +0000 UTC m=+1321.699668900" observedRunningTime="2026-03-08 03:35:21.929886205 +0000 UTC m=+1324.008653412" watchObservedRunningTime="2026-03-08 03:35:21.935192266 +0000 UTC m=+1324.013959483" Mar 08 03:35:21.972506 master-0 kubenswrapper[13046]: I0308 03:35:21.972229 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" podStartSLOduration=4.675726393 podStartE2EDuration="25.972209525s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.307176183 +0000 UTC m=+1300.385943400" lastFinishedPulling="2026-03-08 03:35:19.603659315 +0000 UTC m=+1321.682426532" observedRunningTime="2026-03-08 03:35:21.96002395 +0000 UTC m=+1324.038791167" watchObservedRunningTime="2026-03-08 03:35:21.972209525 +0000 UTC m=+1324.050976742" Mar 08 03:35:22.005501 master-0 kubenswrapper[13046]: I0308 03:35:22.004671 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" podStartSLOduration=8.314463261 podStartE2EDuration="26.004651535s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.883215179 +0000 UTC m=+1300.961982396" lastFinishedPulling="2026-03-08 03:35:16.573403453 +0000 UTC m=+1318.652170670" observedRunningTime="2026-03-08 03:35:21.997492612 +0000 UTC m=+1324.076259829" watchObservedRunningTime="2026-03-08 03:35:22.004651535 +0000 UTC m=+1324.083418752" Mar 08 03:35:22.034161 master-0 kubenswrapper[13046]: I0308 03:35:22.034085 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" podStartSLOduration=6.83570274 podStartE2EDuration="26.034066789s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 
03:34:57.349859282 +0000 UTC m=+1299.428626499" lastFinishedPulling="2026-03-08 03:35:16.548223331 +0000 UTC m=+1318.626990548" observedRunningTime="2026-03-08 03:35:22.032802013 +0000 UTC m=+1324.111569240" watchObservedRunningTime="2026-03-08 03:35:22.034066789 +0000 UTC m=+1324.112834006" Mar 08 03:35:22.070946 master-0 kubenswrapper[13046]: I0308 03:35:22.070859 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" podStartSLOduration=5.998480967 podStartE2EDuration="26.07081487s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:58.870625193 +0000 UTC m=+1300.949392410" lastFinishedPulling="2026-03-08 03:35:18.942959066 +0000 UTC m=+1321.021726313" observedRunningTime="2026-03-08 03:35:22.069708549 +0000 UTC m=+1324.148475766" watchObservedRunningTime="2026-03-08 03:35:22.07081487 +0000 UTC m=+1324.149582087" Mar 08 03:35:22.103436 master-0 kubenswrapper[13046]: I0308 03:35:22.103363 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" podStartSLOduration=7.212297877 podStartE2EDuration="26.103347553s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:34:57.657195096 +0000 UTC m=+1299.735962313" lastFinishedPulling="2026-03-08 03:35:16.548244772 +0000 UTC m=+1318.627011989" observedRunningTime="2026-03-08 03:35:22.097883098 +0000 UTC m=+1324.176650315" watchObservedRunningTime="2026-03-08 03:35:22.103347553 +0000 UTC m=+1324.182114770" Mar 08 03:35:23.911094 master-0 kubenswrapper[13046]: I0308 03:35:23.910955 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" 
event={"ID":"1efdb13d-44bc-429f-bc09-cb520504d91c","Type":"ContainerStarted","Data":"5074ea4962c6935c037a18e7acab586ae436523373e56f66a6d7fe0e04193847"} Mar 08 03:35:23.911094 master-0 kubenswrapper[13046]: I0308 03:35:23.911067 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:35:23.971621 master-0 kubenswrapper[13046]: I0308 03:35:23.969617 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" podStartSLOduration=22.907299125 podStartE2EDuration="27.969598165s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="2026-03-08 03:35:18.447858911 +0000 UTC m=+1320.526626128" lastFinishedPulling="2026-03-08 03:35:23.510157941 +0000 UTC m=+1325.588925168" observedRunningTime="2026-03-08 03:35:23.937586828 +0000 UTC m=+1326.016354045" watchObservedRunningTime="2026-03-08 03:35:23.969598165 +0000 UTC m=+1326.048365382" Mar 08 03:35:25.934002 master-0 kubenswrapper[13046]: I0308 03:35:25.933899 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" event={"ID":"17f3dac6-f1b5-4396-abe1-c2ef1e3a321e","Type":"ContainerStarted","Data":"d0a24bc69eaf2f761a6d052156568862e559e39f17be94a14159078ea984a3c4"} Mar 08 03:35:25.934568 master-0 kubenswrapper[13046]: I0308 03:35:25.934113 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:35:25.970535 master-0 kubenswrapper[13046]: I0308 03:35:25.969887 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" podStartSLOduration=24.25508954 podStartE2EDuration="29.969859595s" podCreationTimestamp="2026-03-08 03:34:56 +0000 
UTC" firstStartedPulling="2026-03-08 03:35:19.634126298 +0000 UTC m=+1321.712893515" lastFinishedPulling="2026-03-08 03:35:25.348896313 +0000 UTC m=+1327.427663570" observedRunningTime="2026-03-08 03:35:25.951894316 +0000 UTC m=+1328.030661573" watchObservedRunningTime="2026-03-08 03:35:25.969859595 +0000 UTC m=+1328.048626852" Mar 08 03:35:26.665774 master-0 kubenswrapper[13046]: I0308 03:35:26.665678 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-dqzpj" Mar 08 03:35:26.689596 master-0 kubenswrapper[13046]: I0308 03:35:26.689540 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-lgvbm" Mar 08 03:35:26.794510 master-0 kubenswrapper[13046]: I0308 03:35:26.793709 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-kjfz5" Mar 08 03:35:26.856507 master-0 kubenswrapper[13046]: I0308 03:35:26.852833 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-h8sbr" Mar 08 03:35:26.912840 master-0 kubenswrapper[13046]: I0308 03:35:26.912456 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-d6sb5" Mar 08 03:35:26.987618 master-0 kubenswrapper[13046]: I0308 03:35:26.986517 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-plz2g" Mar 08 03:35:27.136868 master-0 kubenswrapper[13046]: I0308 03:35:27.136811 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-v9kvs" Mar 08 03:35:27.180275 master-0 kubenswrapper[13046]: I0308 
03:35:27.180224 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-v2kb9" Mar 08 03:35:27.265816 master-0 kubenswrapper[13046]: I0308 03:35:27.265704 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4c8zp" Mar 08 03:35:27.343280 master-0 kubenswrapper[13046]: I0308 03:35:27.343226 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ggv2h" Mar 08 03:35:27.383517 master-0 kubenswrapper[13046]: I0308 03:35:27.383442 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-xf2d6" Mar 08 03:35:27.428391 master-0 kubenswrapper[13046]: I0308 03:35:27.427980 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lsdhl" Mar 08 03:35:27.631752 master-0 kubenswrapper[13046]: I0308 03:35:27.630871 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-mxcfj" Mar 08 03:35:27.741908 master-0 kubenswrapper[13046]: I0308 03:35:27.741851 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-dbrs9" Mar 08 03:35:27.757442 master-0 kubenswrapper[13046]: I0308 03:35:27.757392 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-qsh7t" Mar 08 03:35:27.794340 master-0 kubenswrapper[13046]: I0308 03:35:27.794295 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-ckkmf" Mar 08 03:35:27.983401 master-0 kubenswrapper[13046]: I0308 03:35:27.983284 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-fcgrv" Mar 08 03:35:27.991081 master-0 kubenswrapper[13046]: I0308 03:35:27.991048 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-lt4h9" Mar 08 03:35:28.007094 master-0 kubenswrapper[13046]: I0308 03:35:28.007039 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lv2t9" Mar 08 03:35:29.717102 master-0 kubenswrapper[13046]: I0308 03:35:29.717044 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:29.725589 master-0 kubenswrapper[13046]: I0308 03:35:29.722156 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f77296db-7b56-4306-bcf1-d2cef736f49f-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-m2zkw\" (UID: \"f77296db-7b56-4306-bcf1-d2cef736f49f\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:29.813239 master-0 kubenswrapper[13046]: I0308 03:35:29.813171 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:30.298160 master-0 kubenswrapper[13046]: I0308 03:35:30.298078 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw"] Mar 08 03:35:30.988877 master-0 kubenswrapper[13046]: I0308 03:35:30.988813 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" event={"ID":"f77296db-7b56-4306-bcf1-d2cef736f49f","Type":"ContainerStarted","Data":"db487e93fb834e9260ddc288f21e1eaaf4727c2d3c25f248a4f320d163f0516f"} Mar 08 03:35:30.988877 master-0 kubenswrapper[13046]: I0308 03:35:30.988870 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" event={"ID":"f77296db-7b56-4306-bcf1-d2cef736f49f","Type":"ContainerStarted","Data":"692f6a3812afb22ab1ea0308ccabfa532965d93aedb5e60b943a1164b8b0749e"} Mar 08 03:35:30.990055 master-0 kubenswrapper[13046]: I0308 03:35:30.990014 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:35:32.700030 master-0 kubenswrapper[13046]: I0308 03:35:32.699942 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-99265" Mar 08 03:35:32.740109 master-0 kubenswrapper[13046]: I0308 03:35:32.739954 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" podStartSLOduration=36.739929145 podStartE2EDuration="36.739929145s" podCreationTimestamp="2026-03-08 03:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 
03:35:31.018621332 +0000 UTC m=+1333.097388559" watchObservedRunningTime="2026-03-08 03:35:32.739929145 +0000 UTC m=+1334.818696372" Mar 08 03:35:33.303361 master-0 kubenswrapper[13046]: I0308 03:35:33.303286 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-64kb6" Mar 08 03:35:39.821825 master-0 kubenswrapper[13046]: I0308 03:35:39.821753 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-m2zkw" Mar 08 03:36:23.152318 master-0 kubenswrapper[13046]: I0308 03:36:23.152207 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-pqtmh"] Mar 08 03:36:23.163546 master-0 kubenswrapper[13046]: I0308 03:36:23.163354 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.168216 master-0 kubenswrapper[13046]: I0308 03:36:23.165627 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 03:36:23.168216 master-0 kubenswrapper[13046]: I0308 03:36:23.165800 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 03:36:23.168216 master-0 kubenswrapper[13046]: I0308 03:36:23.165833 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 03:36:23.173648 master-0 kubenswrapper[13046]: I0308 03:36:23.169589 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-pqtmh"] Mar 08 03:36:23.201398 master-0 kubenswrapper[13046]: I0308 03:36:23.198122 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b627a32-e3c8-44aa-8488-90159021cbcf-config\") pod \"dnsmasq-dns-69fd45f56f-pqtmh\" 
(UID: \"4b627a32-e3c8-44aa-8488-90159021cbcf\") " pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.201398 master-0 kubenswrapper[13046]: I0308 03:36:23.198254 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg6xf\" (UniqueName: \"kubernetes.io/projected/4b627a32-e3c8-44aa-8488-90159021cbcf-kube-api-access-mg6xf\") pod \"dnsmasq-dns-69fd45f56f-pqtmh\" (UID: \"4b627a32-e3c8-44aa-8488-90159021cbcf\") " pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.275509 master-0 kubenswrapper[13046]: I0308 03:36:23.258509 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-h56p6"] Mar 08 03:36:23.275509 master-0 kubenswrapper[13046]: I0308 03:36:23.259966 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.291508 master-0 kubenswrapper[13046]: I0308 03:36:23.278151 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-h56p6"] Mar 08 03:36:23.311506 master-0 kubenswrapper[13046]: I0308 03:36:23.304988 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 03:36:23.311506 master-0 kubenswrapper[13046]: I0308 03:36:23.305993 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg6xf\" (UniqueName: \"kubernetes.io/projected/4b627a32-e3c8-44aa-8488-90159021cbcf-kube-api-access-mg6xf\") pod \"dnsmasq-dns-69fd45f56f-pqtmh\" (UID: \"4b627a32-e3c8-44aa-8488-90159021cbcf\") " pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.311506 master-0 kubenswrapper[13046]: I0308 03:36:23.306121 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b627a32-e3c8-44aa-8488-90159021cbcf-config\") pod \"dnsmasq-dns-69fd45f56f-pqtmh\" (UID: 
\"4b627a32-e3c8-44aa-8488-90159021cbcf\") " pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.311506 master-0 kubenswrapper[13046]: I0308 03:36:23.306959 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b627a32-e3c8-44aa-8488-90159021cbcf-config\") pod \"dnsmasq-dns-69fd45f56f-pqtmh\" (UID: \"4b627a32-e3c8-44aa-8488-90159021cbcf\") " pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.361780 master-0 kubenswrapper[13046]: I0308 03:36:23.361735 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg6xf\" (UniqueName: \"kubernetes.io/projected/4b627a32-e3c8-44aa-8488-90159021cbcf-kube-api-access-mg6xf\") pod \"dnsmasq-dns-69fd45f56f-pqtmh\" (UID: \"4b627a32-e3c8-44aa-8488-90159021cbcf\") " pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.407627 master-0 kubenswrapper[13046]: I0308 03:36:23.407513 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-config\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.407861 master-0 kubenswrapper[13046]: I0308 03:36:23.407840 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-dns-svc\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.407961 master-0 kubenswrapper[13046]: I0308 03:36:23.407944 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkztg\" (UniqueName: \"kubernetes.io/projected/4183465e-ec48-4ba6-9e3b-67270b1b2951-kube-api-access-mkztg\") pod 
\"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.497645 master-0 kubenswrapper[13046]: I0308 03:36:23.497586 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:23.510210 master-0 kubenswrapper[13046]: I0308 03:36:23.510156 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-config\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.510311 master-0 kubenswrapper[13046]: I0308 03:36:23.510232 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-dns-svc\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.510389 master-0 kubenswrapper[13046]: I0308 03:36:23.510333 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkztg\" (UniqueName: \"kubernetes.io/projected/4183465e-ec48-4ba6-9e3b-67270b1b2951-kube-api-access-mkztg\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.511162 master-0 kubenswrapper[13046]: I0308 03:36:23.511131 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-config\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.511775 master-0 kubenswrapper[13046]: I0308 03:36:23.511743 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-dns-svc\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.533285 master-0 kubenswrapper[13046]: I0308 03:36:23.533239 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkztg\" (UniqueName: \"kubernetes.io/projected/4183465e-ec48-4ba6-9e3b-67270b1b2951-kube-api-access-mkztg\") pod \"dnsmasq-dns-667b9d65dc-h56p6\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.699790 master-0 kubenswrapper[13046]: I0308 03:36:23.699667 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:23.945387 master-0 kubenswrapper[13046]: W0308 03:36:23.945337 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b627a32_e3c8_44aa_8488_90159021cbcf.slice/crio-8e358cdb275f97ddc48bd989569088e346d189386e1e0bb7f97483b4bd3e0f46 WatchSource:0}: Error finding container 8e358cdb275f97ddc48bd989569088e346d189386e1e0bb7f97483b4bd3e0f46: Status 404 returned error can't find the container with id 8e358cdb275f97ddc48bd989569088e346d189386e1e0bb7f97483b4bd3e0f46 Mar 08 03:36:23.946227 master-0 kubenswrapper[13046]: I0308 03:36:23.946120 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-pqtmh"] Mar 08 03:36:23.947413 master-0 kubenswrapper[13046]: I0308 03:36:23.947386 13046 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 03:36:24.156846 master-0 kubenswrapper[13046]: I0308 03:36:24.156772 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-h56p6"] Mar 
08 03:36:24.157437 master-0 kubenswrapper[13046]: W0308 03:36:24.156858 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4183465e_ec48_4ba6_9e3b_67270b1b2951.slice/crio-a21107483f496a0d2ed010e5d1a0317d2cd102bf3476d90672152e5d8839ead4 WatchSource:0}: Error finding container a21107483f496a0d2ed010e5d1a0317d2cd102bf3476d90672152e5d8839ead4: Status 404 returned error can't find the container with id a21107483f496a0d2ed010e5d1a0317d2cd102bf3476d90672152e5d8839ead4 Mar 08 03:36:24.683895 master-0 kubenswrapper[13046]: I0308 03:36:24.683833 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" event={"ID":"4183465e-ec48-4ba6-9e3b-67270b1b2951","Type":"ContainerStarted","Data":"a21107483f496a0d2ed010e5d1a0317d2cd102bf3476d90672152e5d8839ead4"} Mar 08 03:36:24.686154 master-0 kubenswrapper[13046]: I0308 03:36:24.686103 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" event={"ID":"4b627a32-e3c8-44aa-8488-90159021cbcf","Type":"ContainerStarted","Data":"8e358cdb275f97ddc48bd989569088e346d189386e1e0bb7f97483b4bd3e0f46"} Mar 08 03:36:25.859873 master-0 kubenswrapper[13046]: I0308 03:36:25.859535 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-h56p6"] Mar 08 03:36:25.893531 master-0 kubenswrapper[13046]: I0308 03:36:25.893427 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7466868675-jbnsb"] Mar 08 03:36:25.896624 master-0 kubenswrapper[13046]: I0308 03:36:25.896575 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:25.911375 master-0 kubenswrapper[13046]: I0308 03:36:25.911320 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7466868675-jbnsb"] Mar 08 03:36:25.977607 master-0 kubenswrapper[13046]: I0308 03:36:25.977108 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-config\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:25.977607 master-0 kubenswrapper[13046]: I0308 03:36:25.977274 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxcbs\" (UniqueName: \"kubernetes.io/projected/b6a524a5-622c-4556-a7d7-de89bb944cd6-kube-api-access-hxcbs\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:25.977607 master-0 kubenswrapper[13046]: I0308 03:36:25.977344 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-dns-svc\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.091622 master-0 kubenswrapper[13046]: I0308 03:36:26.090099 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxcbs\" (UniqueName: \"kubernetes.io/projected/b6a524a5-622c-4556-a7d7-de89bb944cd6-kube-api-access-hxcbs\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.091622 master-0 kubenswrapper[13046]: I0308 03:36:26.090792 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-dns-svc\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.091933 master-0 kubenswrapper[13046]: I0308 03:36:26.091895 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-dns-svc\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.095370 master-0 kubenswrapper[13046]: I0308 03:36:26.094323 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-config\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.095445 master-0 kubenswrapper[13046]: I0308 03:36:26.095301 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-config\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.129143 master-0 kubenswrapper[13046]: I0308 03:36:26.119851 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxcbs\" (UniqueName: \"kubernetes.io/projected/b6a524a5-622c-4556-a7d7-de89bb944cd6-kube-api-access-hxcbs\") pod \"dnsmasq-dns-7466868675-jbnsb\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.232377 master-0 kubenswrapper[13046]: I0308 03:36:26.232051 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-69fd45f56f-pqtmh"] Mar 08 03:36:26.259543 master-0 kubenswrapper[13046]: I0308 03:36:26.256150 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:26.287093 master-0 kubenswrapper[13046]: I0308 03:36:26.285141 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-k9nrf"] Mar 08 03:36:26.297250 master-0 kubenswrapper[13046]: I0308 03:36:26.296663 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-k9nrf"] Mar 08 03:36:26.297250 master-0 kubenswrapper[13046]: I0308 03:36:26.296810 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.406396 master-0 kubenswrapper[13046]: I0308 03:36:26.406294 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-config\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.406396 master-0 kubenswrapper[13046]: I0308 03:36:26.406372 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-dns-svc\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.406612 master-0 kubenswrapper[13046]: I0308 03:36:26.406460 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8h7r\" (UniqueName: \"kubernetes.io/projected/38c2671c-0337-4af7-8a29-eef713b62f67-kube-api-access-n8h7r\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " 
pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.515707 master-0 kubenswrapper[13046]: I0308 03:36:26.514056 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-config\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.515707 master-0 kubenswrapper[13046]: I0308 03:36:26.514765 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-config\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.515707 master-0 kubenswrapper[13046]: I0308 03:36:26.514881 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-dns-svc\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.515707 master-0 kubenswrapper[13046]: I0308 03:36:26.515395 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8h7r\" (UniqueName: \"kubernetes.io/projected/38c2671c-0337-4af7-8a29-eef713b62f67-kube-api-access-n8h7r\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.516265 master-0 kubenswrapper[13046]: I0308 03:36:26.515914 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-dns-svc\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 
03:36:26.542346 master-0 kubenswrapper[13046]: I0308 03:36:26.542294 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8h7r\" (UniqueName: \"kubernetes.io/projected/38c2671c-0337-4af7-8a29-eef713b62f67-kube-api-access-n8h7r\") pod \"dnsmasq-dns-76ff7d945-k9nrf\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.668724 master-0 kubenswrapper[13046]: I0308 03:36:26.668527 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:26.896983 master-0 kubenswrapper[13046]: I0308 03:36:26.894541 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7466868675-jbnsb"] Mar 08 03:36:27.183370 master-0 kubenswrapper[13046]: I0308 03:36:27.180799 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-k9nrf"] Mar 08 03:36:27.801977 master-0 kubenswrapper[13046]: I0308 03:36:27.801868 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" event={"ID":"38c2671c-0337-4af7-8a29-eef713b62f67","Type":"ContainerStarted","Data":"3ff9e9fcfaa6cf7f7f6de49617e7efb4db57c141e9524bb4c2da72737b9efa08"} Mar 08 03:36:27.805706 master-0 kubenswrapper[13046]: I0308 03:36:27.805633 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-jbnsb" event={"ID":"b6a524a5-622c-4556-a7d7-de89bb944cd6","Type":"ContainerStarted","Data":"cb68db2c2d8e0b5b38c4c5a1bfbbf0bc6e4029863405919fcdbe3e5f9c98103f"} Mar 08 03:36:30.086275 master-0 kubenswrapper[13046]: I0308 03:36:30.084536 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 03:36:30.094852 master-0 kubenswrapper[13046]: I0308 03:36:30.088873 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.094852 master-0 kubenswrapper[13046]: I0308 03:36:30.093717 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 03:36:30.094852 master-0 kubenswrapper[13046]: I0308 03:36:30.094046 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 03:36:30.094852 master-0 kubenswrapper[13046]: I0308 03:36:30.094357 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 03:36:30.097743 master-0 kubenswrapper[13046]: I0308 03:36:30.096462 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 03:36:30.101325 master-0 kubenswrapper[13046]: I0308 03:36:30.100727 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 08 03:36:30.101804 master-0 kubenswrapper[13046]: I0308 03:36:30.101216 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 08 03:36:30.106377 master-0 kubenswrapper[13046]: I0308 03:36:30.106324 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287155 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287232 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287289 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287342 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-745ms\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-kube-api-access-745ms\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287382 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287409 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-da1ac64e-f8ce-4780-80de-cfd22214afb5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2f6b616f-1770-40ee-829f-b83572602b80\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287428 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287454 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287502 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287519 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.290997 master-0 kubenswrapper[13046]: I0308 03:36:30.287546 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.389051 master-0 
kubenswrapper[13046]: I0308 03:36:30.388976 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.389450 master-0 kubenswrapper[13046]: I0308 03:36:30.389061 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-745ms\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-kube-api-access-745ms\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.389450 master-0 kubenswrapper[13046]: I0308 03:36:30.389109 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.389450 master-0 kubenswrapper[13046]: I0308 03:36:30.389140 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-da1ac64e-f8ce-4780-80de-cfd22214afb5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2f6b616f-1770-40ee-829f-b83572602b80\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.395333 master-0 kubenswrapper[13046]: I0308 03:36:30.393860 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 
03:36:30.395333 master-0 kubenswrapper[13046]: I0308 03:36:30.393929 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.395333 master-0 kubenswrapper[13046]: I0308 03:36:30.393995 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.395333 master-0 kubenswrapper[13046]: I0308 03:36:30.394023 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.403793 master-0 kubenswrapper[13046]: I0308 03:36:30.399955 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.403793 master-0 kubenswrapper[13046]: I0308 03:36:30.400107 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.403793 master-0 kubenswrapper[13046]: 
I0308 03:36:30.400185 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.403793 master-0 kubenswrapper[13046]: I0308 03:36:30.402157 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.403793 master-0 kubenswrapper[13046]: I0308 03:36:30.402456 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.404307 master-0 kubenswrapper[13046]: I0308 03:36:30.404212 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.404824 master-0 kubenswrapper[13046]: I0308 03:36:30.404732 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.405569 master-0 kubenswrapper[13046]: I0308 03:36:30.405519 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.425747 master-0 kubenswrapper[13046]: I0308 03:36:30.409441 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.425747 master-0 kubenswrapper[13046]: I0308 03:36:30.409611 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 03:36:30.425747 master-0 kubenswrapper[13046]: I0308 03:36:30.409657 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-da1ac64e-f8ce-4780-80de-cfd22214afb5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2f6b616f-1770-40ee-829f-b83572602b80\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7a1e1d04c0d637e5ea5625298e47ddf2efc3513584a7a6388060daa2d2f08836/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.425747 master-0 kubenswrapper[13046]: I0308 03:36:30.418904 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.425747 master-0 kubenswrapper[13046]: I0308 03:36:30.421728 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.430202 master-0 kubenswrapper[13046]: I0308 03:36:30.430141 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.463421 master-0 kubenswrapper[13046]: I0308 03:36:30.463171 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-745ms\" (UniqueName: \"kubernetes.io/projected/5d102cd0-1a7f-4196-883c-bf2fd94fc7f2-kube-api-access-745ms\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:30.840944 master-0 kubenswrapper[13046]: I0308 03:36:30.840912 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 08 03:36:30.842418 master-0 kubenswrapper[13046]: I0308 03:36:30.842399 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 08 03:36:30.858553 master-0 kubenswrapper[13046]: I0308 03:36:30.856168 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 03:36:30.858553 master-0 kubenswrapper[13046]: I0308 03:36:30.857880 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 08 03:36:30.867222 master-0 kubenswrapper[13046]: I0308 03:36:30.862057 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 08 03:36:30.868615 master-0 kubenswrapper[13046]: I0308 03:36:30.868507 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 08 03:36:31.041281 master-0 kubenswrapper[13046]: I0308 03:36:31.034716 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96bb7670-d973-44e2-b9f5-887303acf725-kolla-config\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.041281 master-0 kubenswrapper[13046]: I0308 03:36:31.034777 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96bb7670-d973-44e2-b9f5-887303acf725-config-data\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.054643 master-0 kubenswrapper[13046]: I0308 03:36:31.042165 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhl4t\" (UniqueName: \"kubernetes.io/projected/96bb7670-d973-44e2-b9f5-887303acf725-kube-api-access-zhl4t\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.054643 master-0 kubenswrapper[13046]: I0308 03:36:31.042278 
13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bb7670-d973-44e2-b9f5-887303acf725-combined-ca-bundle\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.054643 master-0 kubenswrapper[13046]: I0308 03:36:31.042385 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/96bb7670-d973-44e2-b9f5-887303acf725-memcached-tls-certs\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.148147 master-0 kubenswrapper[13046]: I0308 03:36:31.148000 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhl4t\" (UniqueName: \"kubernetes.io/projected/96bb7670-d973-44e2-b9f5-887303acf725-kube-api-access-zhl4t\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.148147 master-0 kubenswrapper[13046]: I0308 03:36:31.148102 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bb7670-d973-44e2-b9f5-887303acf725-combined-ca-bundle\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.148147 master-0 kubenswrapper[13046]: I0308 03:36:31.148148 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/96bb7670-d973-44e2-b9f5-887303acf725-memcached-tls-certs\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.148727 master-0 kubenswrapper[13046]: I0308 03:36:31.148276 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96bb7670-d973-44e2-b9f5-887303acf725-kolla-config\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.148727 master-0 kubenswrapper[13046]: I0308 03:36:31.148307 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96bb7670-d973-44e2-b9f5-887303acf725-config-data\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.149198 master-0 kubenswrapper[13046]: I0308 03:36:31.149144 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96bb7670-d973-44e2-b9f5-887303acf725-config-data\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.156931 master-0 kubenswrapper[13046]: I0308 03:36:31.156882 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/96bb7670-d973-44e2-b9f5-887303acf725-memcached-tls-certs\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.159210 master-0 kubenswrapper[13046]: I0308 03:36:31.159089 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96bb7670-d973-44e2-b9f5-887303acf725-kolla-config\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.160910 master-0 kubenswrapper[13046]: I0308 03:36:31.160867 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96bb7670-d973-44e2-b9f5-887303acf725-combined-ca-bundle\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " 
pod="openstack/memcached-0" Mar 08 03:36:31.201150 master-0 kubenswrapper[13046]: I0308 03:36:31.200729 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhl4t\" (UniqueName: \"kubernetes.io/projected/96bb7670-d973-44e2-b9f5-887303acf725-kube-api-access-zhl4t\") pod \"memcached-0\" (UID: \"96bb7670-d973-44e2-b9f5-887303acf725\") " pod="openstack/memcached-0" Mar 08 03:36:31.411467 master-0 kubenswrapper[13046]: I0308 03:36:31.411360 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 03:36:31.423611 master-0 kubenswrapper[13046]: I0308 03:36:31.423510 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 03:36:31.423909 master-0 kubenswrapper[13046]: I0308 03:36:31.423874 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.443546 master-0 kubenswrapper[13046]: I0308 03:36:31.437044 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 03:36:31.443546 master-0 kubenswrapper[13046]: I0308 03:36:31.437257 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 03:36:31.443546 master-0 kubenswrapper[13046]: I0308 03:36:31.437356 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 08 03:36:31.443546 master-0 kubenswrapper[13046]: I0308 03:36:31.437453 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 03:36:31.443546 master-0 kubenswrapper[13046]: I0308 03:36:31.437582 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 08 03:36:31.443546 master-0 kubenswrapper[13046]: I0308 03:36:31.439947 13046 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 03:36:31.488160 master-0 kubenswrapper[13046]: I0308 03:36:31.487327 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 03:36:31.578415 master-0 kubenswrapper[13046]: I0308 03:36:31.578335 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578415 master-0 kubenswrapper[13046]: I0308 03:36:31.578409 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf9lz\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-kube-api-access-bf9lz\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578435 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578468 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578509 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578527 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1380aea-3a6c-42ed-be02-bcb76b9d9b2c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^772b5fa0-8ccd-4f71-b608-079ba8783009\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578550 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-config-data\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578571 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578627 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578648 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53e48517-115a-43d0-ad79-a342efe0cf49-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.578817 master-0 kubenswrapper[13046]: I0308 03:36:31.578674 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e48517-115a-43d0-ad79-a342efe0cf49-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.679835 master-0 kubenswrapper[13046]: I0308 03:36:31.679739 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf9lz\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-kube-api-access-bf9lz\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.679835 master-0 kubenswrapper[13046]: I0308 03:36:31.679786 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695322 master-0 kubenswrapper[13046]: I0308 03:36:31.695262 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695523 master-0 kubenswrapper[13046]: I0308 03:36:31.695352 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695523 master-0 kubenswrapper[13046]: I0308 03:36:31.695379 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1380aea-3a6c-42ed-be02-bcb76b9d9b2c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^772b5fa0-8ccd-4f71-b608-079ba8783009\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695523 master-0 kubenswrapper[13046]: I0308 03:36:31.695435 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-config-data\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695523 master-0 kubenswrapper[13046]: I0308 03:36:31.695505 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695645 master-0 kubenswrapper[13046]: I0308 03:36:31.695633 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695682 master-0 kubenswrapper[13046]: I0308 03:36:31.695661 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/53e48517-115a-43d0-ad79-a342efe0cf49-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695736 master-0 kubenswrapper[13046]: I0308 03:36:31.695718 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e48517-115a-43d0-ad79-a342efe0cf49-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.695884 master-0 kubenswrapper[13046]: I0308 03:36:31.690275 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.697509 master-0 kubenswrapper[13046]: I0308 03:36:31.696026 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.697509 master-0 kubenswrapper[13046]: I0308 03:36:31.696616 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-config-data\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.697509 master-0 kubenswrapper[13046]: I0308 03:36:31.696972 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.697509 master-0 kubenswrapper[13046]: I0308 03:36:31.697266 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.697509 master-0 kubenswrapper[13046]: I0308 03:36:31.697455 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e48517-115a-43d0-ad79-a342efe0cf49-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.698859 master-0 kubenswrapper[13046]: I0308 03:36:31.698799 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.698932 master-0 kubenswrapper[13046]: I0308 03:36:31.698896 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 03:36:31.698932 master-0 kubenswrapper[13046]: I0308 03:36:31.698920 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1380aea-3a6c-42ed-be02-bcb76b9d9b2c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^772b5fa0-8ccd-4f71-b608-079ba8783009\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/764ed58a80100a5745f62916e545415f19139d1ba8b2c41debb42252b9e82d17/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.702872 master-0 kubenswrapper[13046]: I0308 03:36:31.702823 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf9lz\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-kube-api-access-bf9lz\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.705634 master-0 kubenswrapper[13046]: I0308 03:36:31.705564 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e48517-115a-43d0-ad79-a342efe0cf49-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.706274 master-0 kubenswrapper[13046]: I0308 03:36:31.706230 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53e48517-115a-43d0-ad79-a342efe0cf49-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:31.708800 master-0 kubenswrapper[13046]: I0308 03:36:31.708731 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e48517-115a-43d0-ad79-a342efe0cf49-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:32.032237 master-0 kubenswrapper[13046]: I0308 03:36:32.032101 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-da1ac64e-f8ce-4780-80de-cfd22214afb5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2f6b616f-1770-40ee-829f-b83572602b80\") pod \"rabbitmq-cell1-server-0\" (UID: \"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:32.242061 master-0 kubenswrapper[13046]: I0308 03:36:32.241972 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:36:32.422362 master-0 kubenswrapper[13046]: I0308 03:36:32.422311 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 08 03:36:32.424185 master-0 kubenswrapper[13046]: I0308 03:36:32.424139 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 03:36:32.430471 master-0 kubenswrapper[13046]: I0308 03:36:32.430427 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 08 03:36:32.430946 master-0 kubenswrapper[13046]: I0308 03:36:32.430749 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 08 03:36:32.430946 master-0 kubenswrapper[13046]: I0308 03:36:32.430899 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 08 03:36:32.477577 master-0 kubenswrapper[13046]: I0308 03:36:32.477246 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626577 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626625 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626696 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626723 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626765 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626794 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pvc-376a2c95-a5ab-41e4-8554-ea8047936cd0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^886cc3dd-ba2a-4537-aa27-ef27e69ceefc\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626815 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ptrb\" (UniqueName: \"kubernetes.io/projected/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-kube-api-access-2ptrb\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.627012 master-0 kubenswrapper[13046]: I0308 03:36:32.626834 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733651 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733713 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733767 13046 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733799 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-376a2c95-a5ab-41e4-8554-ea8047936cd0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^886cc3dd-ba2a-4537-aa27-ef27e69ceefc\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733822 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ptrb\" (UniqueName: \"kubernetes.io/projected/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-kube-api-access-2ptrb\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733837 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733888 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.733906 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.735200 master-0 kubenswrapper[13046]: I0308 03:36:32.734790 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-config-data-default\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.737350 master-0 kubenswrapper[13046]: I0308 03:36:32.737036 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.744591 master-0 kubenswrapper[13046]: I0308 03:36:32.737772 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-kolla-config\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.744591 master-0 kubenswrapper[13046]: I0308 03:36:32.740144 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 03:36:32.744591 master-0 kubenswrapper[13046]: I0308 03:36:32.740181 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-376a2c95-a5ab-41e4-8554-ea8047936cd0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^886cc3dd-ba2a-4537-aa27-ef27e69ceefc\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4271af45a473841d4512996ee515cab87056450e15d6969854493ee0357c1f20/globalmount\"" pod="openstack/openstack-galera-0" Mar 08 03:36:32.748367 master-0 kubenswrapper[13046]: I0308 03:36:32.748303 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.754774 master-0 kubenswrapper[13046]: I0308 03:36:32.748813 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.754774 master-0 kubenswrapper[13046]: I0308 03:36:32.750070 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:32.755721 master-0 kubenswrapper[13046]: I0308 03:36:32.755028 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ptrb\" (UniqueName: \"kubernetes.io/projected/aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd-kube-api-access-2ptrb\") pod \"openstack-galera-0\" (UID: 
\"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:33.396082 master-0 kubenswrapper[13046]: I0308 03:36:33.396019 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 03:36:33.398049 master-0 kubenswrapper[13046]: I0308 03:36:33.398013 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.400533 master-0 kubenswrapper[13046]: I0308 03:36:33.400468 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 08 03:36:33.401013 master-0 kubenswrapper[13046]: I0308 03:36:33.400992 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 08 03:36:33.401262 master-0 kubenswrapper[13046]: I0308 03:36:33.401236 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 08 03:36:33.407313 master-0 kubenswrapper[13046]: I0308 03:36:33.407264 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 03:36:33.467075 master-0 kubenswrapper[13046]: I0308 03:36:33.466004 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1380aea-3a6c-42ed-be02-bcb76b9d9b2c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^772b5fa0-8ccd-4f71-b608-079ba8783009\") pod \"rabbitmq-server-0\" (UID: \"53e48517-115a-43d0-ad79-a342efe0cf49\") " pod="openstack/rabbitmq-server-0" Mar 08 03:36:33.566472 master-0 kubenswrapper[13046]: I0308 03:36:33.566352 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1098c02e-9145-47f5-b794-cdc3f015a7b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 
03:36:33.566472 master-0 kubenswrapper[13046]: I0308 03:36:33.566428 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1098c02e-9145-47f5-b794-cdc3f015a7b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.566472 master-0 kubenswrapper[13046]: I0308 03:36:33.566449 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.566732 master-0 kubenswrapper[13046]: I0308 03:36:33.566523 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9fdd8c9a-fd76-4bc5-9a09-ab46e02b1cb9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^85566f5c-9637-440e-b37f-fb5160a640af\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.566732 master-0 kubenswrapper[13046]: I0308 03:36:33.566574 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.566732 master-0 kubenswrapper[13046]: I0308 03:36:33.566596 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.566732 master-0 kubenswrapper[13046]: I0308 03:36:33.566624 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1098c02e-9145-47f5-b794-cdc3f015a7b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.566732 master-0 kubenswrapper[13046]: I0308 03:36:33.566655 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lwwv\" (UniqueName: \"kubernetes.io/projected/1098c02e-9145-47f5-b794-cdc3f015a7b5-kube-api-access-7lwwv\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.668676 master-0 kubenswrapper[13046]: I0308 03:36:33.668520 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lwwv\" (UniqueName: \"kubernetes.io/projected/1098c02e-9145-47f5-b794-cdc3f015a7b5-kube-api-access-7lwwv\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.668676 master-0 kubenswrapper[13046]: I0308 03:36:33.668676 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1098c02e-9145-47f5-b794-cdc3f015a7b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.668925 master-0 kubenswrapper[13046]: I0308 03:36:33.668718 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/1098c02e-9145-47f5-b794-cdc3f015a7b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.668925 master-0 kubenswrapper[13046]: I0308 03:36:33.668746 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.668925 master-0 kubenswrapper[13046]: I0308 03:36:33.668802 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9fdd8c9a-fd76-4bc5-9a09-ab46e02b1cb9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^85566f5c-9637-440e-b37f-fb5160a640af\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.668925 master-0 kubenswrapper[13046]: I0308 03:36:33.668864 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.668925 master-0 kubenswrapper[13046]: I0308 03:36:33.668894 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.669092 master-0 kubenswrapper[13046]: I0308 03:36:33.668932 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1098c02e-9145-47f5-b794-cdc3f015a7b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.671299 master-0 kubenswrapper[13046]: I0308 03:36:33.671126 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1098c02e-9145-47f5-b794-cdc3f015a7b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.671403 master-0 kubenswrapper[13046]: I0308 03:36:33.671309 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.674355 master-0 kubenswrapper[13046]: I0308 03:36:33.673325 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.674355 master-0 kubenswrapper[13046]: I0308 03:36:33.673590 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 03:36:33.674355 master-0 kubenswrapper[13046]: I0308 03:36:33.673626 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9fdd8c9a-fd76-4bc5-9a09-ab46e02b1cb9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^85566f5c-9637-440e-b37f-fb5160a640af\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/1b35fcfa5bff17e8c6a5412e8a7c03680a72d78cf159d23b5f6f2446099a8792/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.675329 master-0 kubenswrapper[13046]: I0308 03:36:33.674763 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1098c02e-9145-47f5-b794-cdc3f015a7b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.675477 master-0 kubenswrapper[13046]: I0308 03:36:33.675424 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1098c02e-9145-47f5-b794-cdc3f015a7b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.681079 master-0 kubenswrapper[13046]: I0308 03:36:33.681015 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1098c02e-9145-47f5-b794-cdc3f015a7b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:33.691230 master-0 kubenswrapper[13046]: I0308 03:36:33.691124 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 03:36:33.696184 master-0 kubenswrapper[13046]: I0308 03:36:33.696086 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lwwv\" (UniqueName: \"kubernetes.io/projected/1098c02e-9145-47f5-b794-cdc3f015a7b5-kube-api-access-7lwwv\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:34.505254 master-0 kubenswrapper[13046]: I0308 03:36:34.505196 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-376a2c95-a5ab-41e4-8554-ea8047936cd0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^886cc3dd-ba2a-4537-aa27-ef27e69ceefc\") pod \"openstack-galera-0\" (UID: \"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd\") " pod="openstack/openstack-galera-0" Mar 08 03:36:34.859390 master-0 kubenswrapper[13046]: I0308 03:36:34.859322 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 03:36:35.572999 master-0 kubenswrapper[13046]: I0308 03:36:35.572939 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9fdd8c9a-fd76-4bc5-9a09-ab46e02b1cb9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^85566f5c-9637-440e-b37f-fb5160a640af\") pod \"openstack-cell1-galera-0\" (UID: \"1098c02e-9145-47f5-b794-cdc3f015a7b5\") " pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:35.873214 master-0 kubenswrapper[13046]: I0308 03:36:35.873145 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 03:36:37.037863 master-0 kubenswrapper[13046]: I0308 03:36:37.037825 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 03:36:37.194393 master-0 kubenswrapper[13046]: I0308 03:36:37.188979 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4wzhj"] Mar 08 03:36:37.213542 master-0 kubenswrapper[13046]: I0308 03:36:37.202656 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wzhj" Mar 08 03:36:37.224769 master-0 kubenswrapper[13046]: I0308 03:36:37.224338 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 08 03:36:37.225071 master-0 kubenswrapper[13046]: I0308 03:36:37.225029 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 08 03:36:37.244557 master-0 kubenswrapper[13046]: I0308 03:36:37.232674 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5d2gw"] Mar 08 03:36:37.244557 master-0 kubenswrapper[13046]: I0308 03:36:37.236415 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:36:37.260378 master-0 kubenswrapper[13046]: I0308 03:36:37.259602 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wzhj"] Mar 08 03:36:37.271837 master-0 kubenswrapper[13046]: I0308 03:36:37.271773 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-etc-ovs\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:36:37.271948 master-0 kubenswrapper[13046]: I0308 03:36:37.271844 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-lib\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:36:37.271948 master-0 kubenswrapper[13046]: I0308 03:36:37.271865 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-scripts\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj" Mar 08 03:36:37.271948 master-0 kubenswrapper[13046]: I0308 03:36:37.271909 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-run\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:36:37.271948 master-0 kubenswrapper[13046]: I0308 03:36:37.271932 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-log\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:36:37.272406 master-0 kubenswrapper[13046]: I0308 03:36:37.272320 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-ovn-controller-tls-certs\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj" Mar 08 03:36:37.272406 master-0 kubenswrapper[13046]: I0308 03:36:37.272373 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-combined-ca-bundle\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj" Mar 08 03:36:37.272406 master-0 kubenswrapper[13046]: I0308 03:36:37.272398 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvft\" (UniqueName: \"kubernetes.io/projected/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-kube-api-access-jsvft\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj" Mar 08 03:36:37.272539 master-0 kubenswrapper[13046]: I0308 03:36:37.272469 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-log-ovn\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj" Mar 08 03:36:37.272539 master-0 kubenswrapper[13046]: I0308 03:36:37.272504 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b658945-9aef-47dc-8600-eb30f696cc3b-scripts\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.272600 master-0 kubenswrapper[13046]: I0308 03:36:37.272554 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-run-ovn\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.272600 master-0 kubenswrapper[13046]: I0308 03:36:37.272576 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4fr\" (UniqueName: \"kubernetes.io/projected/5b658945-9aef-47dc-8600-eb30f696cc3b-kube-api-access-hl4fr\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.272660 master-0 kubenswrapper[13046]: I0308 03:36:37.272602 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-run\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.293548 master-0 kubenswrapper[13046]: I0308 03:36:37.293382 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5d2gw"]
Mar 08 03:36:37.375820 master-0 kubenswrapper[13046]: I0308 03:36:37.375749 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-combined-ca-bundle\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.375820 master-0 kubenswrapper[13046]: I0308 03:36:37.375820 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvft\" (UniqueName: \"kubernetes.io/projected/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-kube-api-access-jsvft\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.376053 master-0 kubenswrapper[13046]: I0308 03:36:37.375894 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-log-ovn\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.376053 master-0 kubenswrapper[13046]: I0308 03:36:37.375917 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b658945-9aef-47dc-8600-eb30f696cc3b-scripts\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.376053 master-0 kubenswrapper[13046]: I0308 03:36:37.375961 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-run-ovn\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.376053 master-0 kubenswrapper[13046]: I0308 03:36:37.375979 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4fr\" (UniqueName: \"kubernetes.io/projected/5b658945-9aef-47dc-8600-eb30f696cc3b-kube-api-access-hl4fr\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.376053 master-0 kubenswrapper[13046]: I0308 03:36:37.376001 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-run\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.376053 master-0 kubenswrapper[13046]: I0308 03:36:37.376037 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-etc-ovs\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.376223 master-0 kubenswrapper[13046]: I0308 03:36:37.376059 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-lib\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.376223 master-0 kubenswrapper[13046]: I0308 03:36:37.376074 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-scripts\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.376223 master-0 kubenswrapper[13046]: I0308 03:36:37.376105 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-run\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.376223 master-0 kubenswrapper[13046]: I0308 03:36:37.376121 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-log\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.376223 master-0 kubenswrapper[13046]: I0308 03:36:37.376149 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-ovn-controller-tls-certs\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.379939 master-0 kubenswrapper[13046]: I0308 03:36:37.377703 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-log-ovn\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.379939 master-0 kubenswrapper[13046]: I0308 03:36:37.377857 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-lib\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.379939 master-0 kubenswrapper[13046]: I0308 03:36:37.377978 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-run\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.379939 master-0 kubenswrapper[13046]: I0308 03:36:37.378058 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-run\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.379939 master-0 kubenswrapper[13046]: I0308 03:36:37.378089 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-etc-ovs\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.379939 master-0 kubenswrapper[13046]: I0308 03:36:37.378188 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-var-run-ovn\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.379939 master-0 kubenswrapper[13046]: I0308 03:36:37.378817 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b658945-9aef-47dc-8600-eb30f696cc3b-scripts\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.380281 master-0 kubenswrapper[13046]: I0308 03:36:37.380118 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-scripts\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.380281 master-0 kubenswrapper[13046]: I0308 03:36:37.380215 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5b658945-9aef-47dc-8600-eb30f696cc3b-var-log\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.384348 master-0 kubenswrapper[13046]: I0308 03:36:37.383849 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-ovn-controller-tls-certs\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.396574 master-0 kubenswrapper[13046]: I0308 03:36:37.385705 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-combined-ca-bundle\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.400250 master-0 kubenswrapper[13046]: I0308 03:36:37.400208 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvft\" (UniqueName: \"kubernetes.io/projected/fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c-kube-api-access-jsvft\") pod \"ovn-controller-4wzhj\" (UID: \"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c\") " pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.401319 master-0 kubenswrapper[13046]: I0308 03:36:37.400679 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4fr\" (UniqueName: \"kubernetes.io/projected/5b658945-9aef-47dc-8600-eb30f696cc3b-kube-api-access-hl4fr\") pod \"ovn-controller-ovs-5d2gw\" (UID: \"5b658945-9aef-47dc-8600-eb30f696cc3b\") " pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:37.604065 master-0 kubenswrapper[13046]: I0308 03:36:37.604011 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wzhj"
Mar 08 03:36:37.617568 master-0 kubenswrapper[13046]: I0308 03:36:37.617358 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5d2gw"
Mar 08 03:36:38.802174 master-0 kubenswrapper[13046]: I0308 03:36:38.802078 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 03:36:38.808291 master-0 kubenswrapper[13046]: I0308 03:36:38.805616 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.808950 master-0 kubenswrapper[13046]: I0308 03:36:38.808918 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 08 03:36:38.809262 master-0 kubenswrapper[13046]: I0308 03:36:38.809247 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 08 03:36:38.809457 master-0 kubenswrapper[13046]: I0308 03:36:38.809443 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 08 03:36:38.809673 master-0 kubenswrapper[13046]: I0308 03:36:38.809659 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 08 03:36:38.821216 master-0 kubenswrapper[13046]: I0308 03:36:38.814294 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 03:36:38.917081 master-0 kubenswrapper[13046]: I0308 03:36:38.917015 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.917081 master-0 kubenswrapper[13046]: I0308 03:36:38.917079 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.917361 master-0 kubenswrapper[13046]: I0308 03:36:38.917113 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-config\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.917361 master-0 kubenswrapper[13046]: I0308 03:36:38.917140 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbrw\" (UniqueName: \"kubernetes.io/projected/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-kube-api-access-zwbrw\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.917361 master-0 kubenswrapper[13046]: I0308 03:36:38.917170 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.917361 master-0 kubenswrapper[13046]: I0308 03:36:38.917311 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.917361 master-0 kubenswrapper[13046]: I0308 03:36:38.917350 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-38c0cf26-59e9-4245-b9e7-8cf837d9272d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f1ede398-56e3-417f-8870-08d90cb57d2a\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:38.918720 master-0 kubenswrapper[13046]: I0308 03:36:38.917595 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.019526 master-0 kubenswrapper[13046]: I0308 03:36:39.019446 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.019526 master-0 kubenswrapper[13046]: I0308 03:36:39.019531 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-38c0cf26-59e9-4245-b9e7-8cf837d9272d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f1ede398-56e3-417f-8870-08d90cb57d2a\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.019783 master-0 kubenswrapper[13046]: I0308 03:36:39.019637 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.019783 master-0 kubenswrapper[13046]: I0308 03:36:39.019709 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.019783 master-0 kubenswrapper[13046]: I0308 03:36:39.019730 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.019783 master-0 kubenswrapper[13046]: I0308 03:36:39.019752 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-config\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.021123 master-0 kubenswrapper[13046]: I0308 03:36:39.020462 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbrw\" (UniqueName: \"kubernetes.io/projected/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-kube-api-access-zwbrw\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.021123 master-0 kubenswrapper[13046]: I0308 03:36:39.020562 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.021123 master-0 kubenswrapper[13046]: I0308 03:36:39.020972 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-config\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.021249 master-0 kubenswrapper[13046]: I0308 03:36:39.021235 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.023243 master-0 kubenswrapper[13046]: I0308 03:36:39.023223 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 03:36:39.023315 master-0 kubenswrapper[13046]: I0308 03:36:39.023258 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-38c0cf26-59e9-4245-b9e7-8cf837d9272d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f1ede398-56e3-417f-8870-08d90cb57d2a\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c220bd7104e8adc337791b9ee9c9ad71f7d5eb0d976df124002ee99b4023c05b/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.025560 master-0 kubenswrapper[13046]: I0308 03:36:39.023621 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.025560 master-0 kubenswrapper[13046]: I0308 03:36:39.025281 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.027010 master-0 kubenswrapper[13046]: I0308 03:36:39.026582 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.027010 master-0 kubenswrapper[13046]: I0308 03:36:39.026967 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.040513 master-0 kubenswrapper[13046]: I0308 03:36:39.038344 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbrw\" (UniqueName: \"kubernetes.io/projected/3533a834-99ca-4bb9-bc59-2c8eeb11a85e-kube-api-access-zwbrw\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:39.895284 master-0 kubenswrapper[13046]: W0308 03:36:39.895198 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d102cd0_1a7f_4196_883c_bf2fd94fc7f2.slice/crio-5663ff55ce5f516710da6d9b9aa5b68caa750dd9d2c3f8bb77d802a387b2f35c WatchSource:0}: Error finding container 5663ff55ce5f516710da6d9b9aa5b68caa750dd9d2c3f8bb77d802a387b2f35c: Status 404 returned error can't find the container with id 5663ff55ce5f516710da6d9b9aa5b68caa750dd9d2c3f8bb77d802a387b2f35c
Mar 08 03:36:40.104596 master-0 kubenswrapper[13046]: I0308 03:36:40.104505 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2","Type":"ContainerStarted","Data":"5663ff55ce5f516710da6d9b9aa5b68caa750dd9d2c3f8bb77d802a387b2f35c"}
Mar 08 03:36:40.448994 master-0 kubenswrapper[13046]: I0308 03:36:40.448886 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-38c0cf26-59e9-4245-b9e7-8cf837d9272d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f1ede398-56e3-417f-8870-08d90cb57d2a\") pod \"ovsdbserver-nb-0\" (UID: \"3533a834-99ca-4bb9-bc59-2c8eeb11a85e\") " pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:40.649518 master-0 kubenswrapper[13046]: I0308 03:36:40.649469 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 08 03:36:41.166522 master-0 kubenswrapper[13046]: I0308 03:36:41.166023 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 08 03:36:41.172152 master-0 kubenswrapper[13046]: I0308 03:36:41.169514 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.172996 master-0 kubenswrapper[13046]: I0308 03:36:41.172504 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 08 03:36:41.172996 master-0 kubenswrapper[13046]: I0308 03:36:41.172523 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 08 03:36:41.178896 master-0 kubenswrapper[13046]: I0308 03:36:41.175308 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 08 03:36:41.179250 master-0 kubenswrapper[13046]: I0308 03:36:41.175465 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 08 03:36:41.274860 master-0 kubenswrapper[13046]: I0308 03:36:41.274439 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c186da86-9a0c-48e2-a06a-babcc5d9e02c-config\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.275145 master-0 kubenswrapper[13046]: I0308 03:36:41.274926 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.275145 master-0 kubenswrapper[13046]: I0308 03:36:41.275015 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww45r\" (UniqueName: \"kubernetes.io/projected/c186da86-9a0c-48e2-a06a-babcc5d9e02c-kube-api-access-ww45r\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.275592 master-0 kubenswrapper[13046]: I0308 03:36:41.275557 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d569801c-2281-403c-a419-aa8e170cd0c8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1045f44b-3572-4c1d-94ba-1ee1d17ce744\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.275651 master-0 kubenswrapper[13046]: I0308 03:36:41.275634 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c186da86-9a0c-48e2-a06a-babcc5d9e02c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.275732 master-0 kubenswrapper[13046]: I0308 03:36:41.275704 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.275825 master-0 kubenswrapper[13046]: I0308 03:36:41.275791 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c186da86-9a0c-48e2-a06a-babcc5d9e02c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.275860 master-0 kubenswrapper[13046]: I0308 03:36:41.275838 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.377821 master-0 kubenswrapper[13046]: I0308 03:36:41.377736 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d569801c-2281-403c-a419-aa8e170cd0c8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1045f44b-3572-4c1d-94ba-1ee1d17ce744\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378108 master-0 kubenswrapper[13046]: I0308 03:36:41.377883 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c186da86-9a0c-48e2-a06a-babcc5d9e02c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378561 master-0 kubenswrapper[13046]: I0308 03:36:41.378525 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c186da86-9a0c-48e2-a06a-babcc5d9e02c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378677 master-0 kubenswrapper[13046]: I0308 03:36:41.378628 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378677 master-0 kubenswrapper[13046]: I0308 03:36:41.378662 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c186da86-9a0c-48e2-a06a-babcc5d9e02c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378983 master-0 kubenswrapper[13046]: I0308 03:36:41.378684 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378983 master-0 kubenswrapper[13046]: I0308 03:36:41.378754 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c186da86-9a0c-48e2-a06a-babcc5d9e02c-config\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378983 master-0 kubenswrapper[13046]: I0308 03:36:41.378815 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.378983 master-0 kubenswrapper[13046]: I0308 03:36:41.378879 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww45r\" (UniqueName: \"kubernetes.io/projected/c186da86-9a0c-48e2-a06a-babcc5d9e02c-kube-api-access-ww45r\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.380117 master-0 kubenswrapper[13046]: I0308 03:36:41.379811 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 03:36:41.380117 master-0 kubenswrapper[13046]: I0308 03:36:41.379868 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d569801c-2281-403c-a419-aa8e170cd0c8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1045f44b-3572-4c1d-94ba-1ee1d17ce744\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/55097acbdef43a4dac4942e842bef7697eef55c699a548a018163e00ed7a4ace/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.380829 master-0 kubenswrapper[13046]: I0308 03:36:41.380777 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c186da86-9a0c-48e2-a06a-babcc5d9e02c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.381174 master-0 kubenswrapper[13046]: I0308 03:36:41.380944 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c186da86-9a0c-48e2-a06a-babcc5d9e02c-config\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.386694 master-0 kubenswrapper[13046]: I0308 03:36:41.386619 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.386888 master-0 kubenswrapper[13046]: I0308 03:36:41.386848 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.405477 master-0 kubenswrapper[13046]: I0308 03:36:41.401681 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c186da86-9a0c-48e2-a06a-babcc5d9e02c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:41.406571 master-0 kubenswrapper[13046]: I0308 03:36:41.405916 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww45r\" (UniqueName: \"kubernetes.io/projected/c186da86-9a0c-48e2-a06a-babcc5d9e02c-kube-api-access-ww45r\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:42.814246 master-0 kubenswrapper[13046]: I0308 03:36:42.814174 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d569801c-2281-403c-a419-aa8e170cd0c8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1045f44b-3572-4c1d-94ba-1ee1d17ce744\") pod \"ovsdbserver-sb-0\" (UID: \"c186da86-9a0c-48e2-a06a-babcc5d9e02c\") " pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:42.997511 master-0 kubenswrapper[13046]: I0308 03:36:42.997332 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 08 03:36:45.037090 master-0 kubenswrapper[13046]: I0308 03:36:45.037031 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 03:36:45.442752 master-0 kubenswrapper[13046]: W0308 03:36:45.442409 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e48517_115a_43d0_ad79_a342efe0cf49.slice/crio-2b738ed74060aa4d73b66aea5df7716b34405f12824249684f55410fbabf5b6b WatchSource:0}: Error finding container 2b738ed74060aa4d73b66aea5df7716b34405f12824249684f55410fbabf5b6b: Status 404 returned error can't find the container with id 2b738ed74060aa4d73b66aea5df7716b34405f12824249684f55410fbabf5b6b
Mar 08 03:36:45.985783 master-0 kubenswrapper[13046]: I0308 03:36:45.979294 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 08 03:36:46.071120 master-0 kubenswrapper[13046]: I0308 03:36:46.071059 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 03:36:46.177755 master-0 kubenswrapper[13046]: I0308 03:36:46.177533 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e48517-115a-43d0-ad79-a342efe0cf49","Type":"ContainerStarted","Data":"2b738ed74060aa4d73b66aea5df7716b34405f12824249684f55410fbabf5b6b"}
Mar 08 03:36:46.179936 master-0 kubenswrapper[13046]: I0308 03:36:46.179881 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-jbnsb" event={"ID":"b6a524a5-622c-4556-a7d7-de89bb944cd6","Type":"ContainerStarted","Data":"1e6b42ee79a24892490cfad73780ec5a80e601c817d29dc2bfa94408478d33eb"}
Mar 08 03:36:46.181883 master-0 kubenswrapper[13046]: I0308 03:36:46.181647 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" event={"ID":"4b627a32-e3c8-44aa-8488-90159021cbcf","Type":"ContainerStarted","Data":"0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37"}
Mar 08 03:36:46.181883 master-0 kubenswrapper[13046]: I0308 03:36:46.181721 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" podUID="4b627a32-e3c8-44aa-8488-90159021cbcf" containerName="init" containerID="cri-o://0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37" gracePeriod=10
Mar 08 03:36:46.184448 master-0 kubenswrapper[13046]: I0308 03:36:46.184374 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" event={"ID":"4183465e-ec48-4ba6-9e3b-67270b1b2951","Type":"ContainerStarted","Data":"81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd"}
Mar 08 03:36:46.184589 master-0 kubenswrapper[13046]: I0308 03:36:46.184542 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" podUID="4183465e-ec48-4ba6-9e3b-67270b1b2951" containerName="init" containerID="cri-o://81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd" gracePeriod=10
Mar 08 03:36:46.269354 master-0 kubenswrapper[13046]: I0308 03:36:46.269304 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 08 03:36:46.646265 master-0 kubenswrapper[13046]: I0308 03:36:46.643749 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wzhj"]
Mar 08 03:36:46.762777 master-0 kubenswrapper[13046]: I0308 03:36:46.762727 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 03:36:47.216248 master-0 kubenswrapper[13046]: W0308 03:36:47.216134 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1098c02e_9145_47f5_b794_cdc3f015a7b5.slice/crio-306213b915d1fd630206134947e46bf3a6a658f0336ff0998f2c3a88e6cdbef2 WatchSource:0}: Error finding container 306213b915d1fd630206134947e46bf3a6a658f0336ff0998f2c3a88e6cdbef2: Status 404 returned error can't find the container with id 306213b915d1fd630206134947e46bf3a6a658f0336ff0998f2c3a88e6cdbef2
Mar 08 03:36:47.239718 master-0 kubenswrapper[13046]: W0308 03:36:47.239620 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe1b89c1_2d73_4551_81ea_a2ef4dd88b5c.slice/crio-3cdd919ff6636441c181eb2a4f7fa87edff74bb6eb3d863aa2a87eba0d24559d WatchSource:0}: Error finding container 3cdd919ff6636441c181eb2a4f7fa87edff74bb6eb3d863aa2a87eba0d24559d: Status 404 returned error can't find the container with id 3cdd919ff6636441c181eb2a4f7fa87edff74bb6eb3d863aa2a87eba0d24559d
Mar 08 03:36:47.242106 master-0 kubenswrapper[13046]: W0308 03:36:47.241307 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3533a834_99ca_4bb9_bc59_2c8eeb11a85e.slice/crio-052f98a70919ad0de3d0280fff16a8518d06dc3201caaad6d706a2de0949f987 WatchSource:0}: Error finding container 052f98a70919ad0de3d0280fff16a8518d06dc3201caaad6d706a2de0949f987: Status 404 returned error can't find the container with id 052f98a70919ad0de3d0280fff16a8518d06dc3201caaad6d706a2de0949f987
Mar 08 03:36:47.242754 master-0 kubenswrapper[13046]: W0308 03:36:47.242601 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeffd4c4_56b4_41c6_b701_0b0ad0ff63bd.slice/crio-f07460bba296308d9c641fffc1d42ed501b99efa1658f57fb88ff682da8f7ab9 WatchSource:0}: Error finding container f07460bba296308d9c641fffc1d42ed501b99efa1658f57fb88ff682da8f7ab9: Status 404 returned error can't find the
container with id f07460bba296308d9c641fffc1d42ed501b99efa1658f57fb88ff682da8f7ab9 Mar 08 03:36:47.246971 master-0 kubenswrapper[13046]: W0308 03:36:47.246932 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96bb7670_d973_44e2_b9f5_887303acf725.slice/crio-3d8512bad07ecc1a6cf81c4b29a537d3f0924afb8f1b62366a2c8eec821400ab WatchSource:0}: Error finding container 3d8512bad07ecc1a6cf81c4b29a537d3f0924afb8f1b62366a2c8eec821400ab: Status 404 returned error can't find the container with id 3d8512bad07ecc1a6cf81c4b29a537d3f0924afb8f1b62366a2c8eec821400ab Mar 08 03:36:47.679501 master-0 kubenswrapper[13046]: I0308 03:36:47.678989 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 03:36:47.932735 master-0 kubenswrapper[13046]: I0308 03:36:47.932670 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5d2gw"] Mar 08 03:36:48.046166 master-0 kubenswrapper[13046]: W0308 03:36:48.046031 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b658945_9aef_47dc_8600_eb30f696cc3b.slice/crio-0d745b5697c90d75e05cdbe3f859d0d855a86de443893a55b97dbc94edfa600d WatchSource:0}: Error finding container 0d745b5697c90d75e05cdbe3f859d0d855a86de443893a55b97dbc94edfa600d: Status 404 returned error can't find the container with id 0d745b5697c90d75e05cdbe3f859d0d855a86de443893a55b97dbc94edfa600d Mar 08 03:36:48.132134 master-0 kubenswrapper[13046]: I0308 03:36:48.132091 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:48.154278 master-0 kubenswrapper[13046]: I0308 03:36:48.154225 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:48.216757 master-0 kubenswrapper[13046]: I0308 03:36:48.216539 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5d2gw" event={"ID":"5b658945-9aef-47dc-8600-eb30f696cc3b","Type":"ContainerStarted","Data":"0d745b5697c90d75e05cdbe3f859d0d855a86de443893a55b97dbc94edfa600d"} Mar 08 03:36:48.218601 master-0 kubenswrapper[13046]: I0308 03:36:48.218573 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wzhj" event={"ID":"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c","Type":"ContainerStarted","Data":"3cdd919ff6636441c181eb2a4f7fa87edff74bb6eb3d863aa2a87eba0d24559d"} Mar 08 03:36:48.220516 master-0 kubenswrapper[13046]: I0308 03:36:48.220466 13046 generic.go:334] "Generic (PLEG): container finished" podID="4183465e-ec48-4ba6-9e3b-67270b1b2951" containerID="81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd" exitCode=0 Mar 08 03:36:48.220644 master-0 kubenswrapper[13046]: I0308 03:36:48.220546 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" event={"ID":"4183465e-ec48-4ba6-9e3b-67270b1b2951","Type":"ContainerDied","Data":"81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd"} Mar 08 03:36:48.220644 master-0 kubenswrapper[13046]: I0308 03:36:48.220564 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" event={"ID":"4183465e-ec48-4ba6-9e3b-67270b1b2951","Type":"ContainerDied","Data":"a21107483f496a0d2ed010e5d1a0317d2cd102bf3476d90672152e5d8839ead4"} Mar 08 03:36:48.220644 master-0 kubenswrapper[13046]: I0308 03:36:48.220582 13046 scope.go:117] "RemoveContainer" containerID="81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd" Mar 08 03:36:48.220977 master-0 kubenswrapper[13046]: I0308 03:36:48.220753 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-h56p6" Mar 08 03:36:48.226752 master-0 kubenswrapper[13046]: I0308 03:36:48.226713 13046 generic.go:334] "Generic (PLEG): container finished" podID="38c2671c-0337-4af7-8a29-eef713b62f67" containerID="d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11" exitCode=0 Mar 08 03:36:48.226856 master-0 kubenswrapper[13046]: I0308 03:36:48.226777 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" event={"ID":"38c2671c-0337-4af7-8a29-eef713b62f67","Type":"ContainerDied","Data":"d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11"} Mar 08 03:36:48.241050 master-0 kubenswrapper[13046]: I0308 03:36:48.239430 13046 generic.go:334] "Generic (PLEG): container finished" podID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerID="1e6b42ee79a24892490cfad73780ec5a80e601c817d29dc2bfa94408478d33eb" exitCode=0 Mar 08 03:36:48.241050 master-0 kubenswrapper[13046]: I0308 03:36:48.239433 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-jbnsb" event={"ID":"b6a524a5-622c-4556-a7d7-de89bb944cd6","Type":"ContainerDied","Data":"1e6b42ee79a24892490cfad73780ec5a80e601c817d29dc2bfa94408478d33eb"} Mar 08 03:36:48.241366 master-0 kubenswrapper[13046]: I0308 03:36:48.241274 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3533a834-99ca-4bb9-bc59-2c8eeb11a85e","Type":"ContainerStarted","Data":"052f98a70919ad0de3d0280fff16a8518d06dc3201caaad6d706a2de0949f987"} Mar 08 03:36:48.256723 master-0 kubenswrapper[13046]: I0308 03:36:48.256674 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-dns-svc\") pod \"4183465e-ec48-4ba6-9e3b-67270b1b2951\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " Mar 08 03:36:48.257072 master-0 kubenswrapper[13046]: I0308 
03:36:48.257050 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b627a32-e3c8-44aa-8488-90159021cbcf-config\") pod \"4b627a32-e3c8-44aa-8488-90159021cbcf\" (UID: \"4b627a32-e3c8-44aa-8488-90159021cbcf\") " Mar 08 03:36:48.257122 master-0 kubenswrapper[13046]: I0308 03:36:48.257107 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg6xf\" (UniqueName: \"kubernetes.io/projected/4b627a32-e3c8-44aa-8488-90159021cbcf-kube-api-access-mg6xf\") pod \"4b627a32-e3c8-44aa-8488-90159021cbcf\" (UID: \"4b627a32-e3c8-44aa-8488-90159021cbcf\") " Mar 08 03:36:48.257726 master-0 kubenswrapper[13046]: I0308 03:36:48.257658 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkztg\" (UniqueName: \"kubernetes.io/projected/4183465e-ec48-4ba6-9e3b-67270b1b2951-kube-api-access-mkztg\") pod \"4183465e-ec48-4ba6-9e3b-67270b1b2951\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " Mar 08 03:36:48.257821 master-0 kubenswrapper[13046]: I0308 03:36:48.257797 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-config\") pod \"4183465e-ec48-4ba6-9e3b-67270b1b2951\" (UID: \"4183465e-ec48-4ba6-9e3b-67270b1b2951\") " Mar 08 03:36:48.258755 master-0 kubenswrapper[13046]: I0308 03:36:48.258711 13046 scope.go:117] "RemoveContainer" containerID="81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd" Mar 08 03:36:48.262382 master-0 kubenswrapper[13046]: I0308 03:36:48.261707 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b627a32-e3c8-44aa-8488-90159021cbcf-kube-api-access-mg6xf" (OuterVolumeSpecName: "kube-api-access-mg6xf") pod "4b627a32-e3c8-44aa-8488-90159021cbcf" (UID: "4b627a32-e3c8-44aa-8488-90159021cbcf"). 
InnerVolumeSpecName "kube-api-access-mg6xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:36:48.274820 master-0 kubenswrapper[13046]: I0308 03:36:48.266885 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"96bb7670-d973-44e2-b9f5-887303acf725","Type":"ContainerStarted","Data":"3d8512bad07ecc1a6cf81c4b29a537d3f0924afb8f1b62366a2c8eec821400ab"} Mar 08 03:36:48.274820 master-0 kubenswrapper[13046]: I0308 03:36:48.268981 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd","Type":"ContainerStarted","Data":"f07460bba296308d9c641fffc1d42ed501b99efa1658f57fb88ff682da8f7ab9"} Mar 08 03:36:48.280915 master-0 kubenswrapper[13046]: E0308 03:36:48.280453 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd\": container with ID starting with 81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd not found: ID does not exist" containerID="81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd" Mar 08 03:36:48.280915 master-0 kubenswrapper[13046]: I0308 03:36:48.280523 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd"} err="failed to get container status \"81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd\": rpc error: code = NotFound desc = could not find container \"81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd\": container with ID starting with 81cb3d6e4f2fbdf9a582016d9c85a1fc64ef66e5f627413e2b6b53e13acb49bd not found: ID does not exist" Mar 08 03:36:48.282324 master-0 kubenswrapper[13046]: I0308 03:36:48.281695 13046 generic.go:334] "Generic (PLEG): container finished" podID="4b627a32-e3c8-44aa-8488-90159021cbcf" 
containerID="0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37" exitCode=0 Mar 08 03:36:48.282324 master-0 kubenswrapper[13046]: I0308 03:36:48.281745 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" event={"ID":"4b627a32-e3c8-44aa-8488-90159021cbcf","Type":"ContainerDied","Data":"0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37"} Mar 08 03:36:48.282324 master-0 kubenswrapper[13046]: I0308 03:36:48.281805 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" Mar 08 03:36:48.282324 master-0 kubenswrapper[13046]: I0308 03:36:48.281873 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69fd45f56f-pqtmh" event={"ID":"4b627a32-e3c8-44aa-8488-90159021cbcf","Type":"ContainerDied","Data":"8e358cdb275f97ddc48bd989569088e346d189386e1e0bb7f97483b4bd3e0f46"} Mar 08 03:36:48.282324 master-0 kubenswrapper[13046]: I0308 03:36:48.281933 13046 scope.go:117] "RemoveContainer" containerID="0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37" Mar 08 03:36:48.287326 master-0 kubenswrapper[13046]: I0308 03:36:48.287044 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b627a32-e3c8-44aa-8488-90159021cbcf-config" (OuterVolumeSpecName: "config") pod "4b627a32-e3c8-44aa-8488-90159021cbcf" (UID: "4b627a32-e3c8-44aa-8488-90159021cbcf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:36:48.287326 master-0 kubenswrapper[13046]: I0308 03:36:48.287220 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1098c02e-9145-47f5-b794-cdc3f015a7b5","Type":"ContainerStarted","Data":"306213b915d1fd630206134947e46bf3a6a658f0336ff0998f2c3a88e6cdbef2"} Mar 08 03:36:48.289055 master-0 kubenswrapper[13046]: I0308 03:36:48.289008 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c186da86-9a0c-48e2-a06a-babcc5d9e02c","Type":"ContainerStarted","Data":"207ea6a92e95328a56bbb326620173e47b7bab0a1407470c9b24f5d2bd6cc931"} Mar 08 03:36:48.341758 master-0 kubenswrapper[13046]: I0308 03:36:48.341709 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4183465e-ec48-4ba6-9e3b-67270b1b2951-kube-api-access-mkztg" (OuterVolumeSpecName: "kube-api-access-mkztg") pod "4183465e-ec48-4ba6-9e3b-67270b1b2951" (UID: "4183465e-ec48-4ba6-9e3b-67270b1b2951"). InnerVolumeSpecName "kube-api-access-mkztg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:36:48.363022 master-0 kubenswrapper[13046]: I0308 03:36:48.362988 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkztg\" (UniqueName: \"kubernetes.io/projected/4183465e-ec48-4ba6-9e3b-67270b1b2951-kube-api-access-mkztg\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:48.363022 master-0 kubenswrapper[13046]: I0308 03:36:48.363020 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b627a32-e3c8-44aa-8488-90159021cbcf-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:48.363167 master-0 kubenswrapper[13046]: I0308 03:36:48.363032 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg6xf\" (UniqueName: \"kubernetes.io/projected/4b627a32-e3c8-44aa-8488-90159021cbcf-kube-api-access-mg6xf\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:48.385407 master-0 kubenswrapper[13046]: I0308 03:36:48.385360 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4183465e-ec48-4ba6-9e3b-67270b1b2951" (UID: "4183465e-ec48-4ba6-9e3b-67270b1b2951"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:36:48.401835 master-0 kubenswrapper[13046]: I0308 03:36:48.401759 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-config" (OuterVolumeSpecName: "config") pod "4183465e-ec48-4ba6-9e3b-67270b1b2951" (UID: "4183465e-ec48-4ba6-9e3b-67270b1b2951"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:36:48.452950 master-0 kubenswrapper[13046]: I0308 03:36:48.446383 13046 scope.go:117] "RemoveContainer" containerID="0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37" Mar 08 03:36:48.452950 master-0 kubenswrapper[13046]: E0308 03:36:48.450806 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37\": container with ID starting with 0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37 not found: ID does not exist" containerID="0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37" Mar 08 03:36:48.452950 master-0 kubenswrapper[13046]: I0308 03:36:48.450889 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37"} err="failed to get container status \"0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37\": rpc error: code = NotFound desc = could not find container \"0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37\": container with ID starting with 0a144a580585a10bd4f12379e436a921a171f2f9309f5ccbf23edafd3ed27d37 not found: ID does not exist" Mar 08 03:36:48.466763 master-0 kubenswrapper[13046]: I0308 03:36:48.465456 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:48.466763 master-0 kubenswrapper[13046]: I0308 03:36:48.465536 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4183465e-ec48-4ba6-9e3b-67270b1b2951-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:48.615012 master-0 kubenswrapper[13046]: I0308 03:36:48.614949 13046 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-h56p6"] Mar 08 03:36:48.634266 master-0 kubenswrapper[13046]: I0308 03:36:48.633435 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-h56p6"] Mar 08 03:36:48.660503 master-0 kubenswrapper[13046]: I0308 03:36:48.653545 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-pqtmh"] Mar 08 03:36:48.667698 master-0 kubenswrapper[13046]: I0308 03:36:48.664261 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-pqtmh"] Mar 08 03:36:49.335506 master-0 kubenswrapper[13046]: I0308 03:36:49.333949 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2","Type":"ContainerStarted","Data":"b8b3d996d279e51508fd7b5c06b8a98d8db482cef60ca5b9e6f6accc57dc2f25"} Mar 08 03:36:49.341506 master-0 kubenswrapper[13046]: I0308 03:36:49.340454 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" event={"ID":"38c2671c-0337-4af7-8a29-eef713b62f67","Type":"ContainerStarted","Data":"3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb"} Mar 08 03:36:49.341506 master-0 kubenswrapper[13046]: I0308 03:36:49.340722 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:49.354448 master-0 kubenswrapper[13046]: I0308 03:36:49.354384 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e48517-115a-43d0-ad79-a342efe0cf49","Type":"ContainerStarted","Data":"fcd397f1176ae3be841d4933fab3fb6f7baaa4572b3a12048ac867b352020bf7"} Mar 08 03:36:49.374508 master-0 kubenswrapper[13046]: I0308 03:36:49.373186 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-jbnsb" 
event={"ID":"b6a524a5-622c-4556-a7d7-de89bb944cd6","Type":"ContainerStarted","Data":"a9b45b7396059ecce416b5911669dc367a4cf158b822a9e12ed734a703811004"} Mar 08 03:36:49.374508 master-0 kubenswrapper[13046]: I0308 03:36:49.374125 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:49.439933 master-0 kubenswrapper[13046]: I0308 03:36:49.439855 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7466868675-jbnsb" podStartSLOduration=5.526142195 podStartE2EDuration="24.439838837s" podCreationTimestamp="2026-03-08 03:36:25 +0000 UTC" firstStartedPulling="2026-03-08 03:36:26.87263256 +0000 UTC m=+1388.951399777" lastFinishedPulling="2026-03-08 03:36:45.786329202 +0000 UTC m=+1407.865096419" observedRunningTime="2026-03-08 03:36:49.426105228 +0000 UTC m=+1411.504872445" watchObservedRunningTime="2026-03-08 03:36:49.439838837 +0000 UTC m=+1411.518606054" Mar 08 03:36:49.561714 master-0 kubenswrapper[13046]: I0308 03:36:49.561609 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" podStartSLOduration=4.950623868 podStartE2EDuration="23.561588718s" podCreationTimestamp="2026-03-08 03:36:26 +0000 UTC" firstStartedPulling="2026-03-08 03:36:27.175717372 +0000 UTC m=+1389.254484589" lastFinishedPulling="2026-03-08 03:36:45.786682222 +0000 UTC m=+1407.865449439" observedRunningTime="2026-03-08 03:36:49.55670016 +0000 UTC m=+1411.635467377" watchObservedRunningTime="2026-03-08 03:36:49.561588718 +0000 UTC m=+1411.640355935" Mar 08 03:36:50.138263 master-0 kubenswrapper[13046]: I0308 03:36:50.138212 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4183465e-ec48-4ba6-9e3b-67270b1b2951" path="/var/lib/kubelet/pods/4183465e-ec48-4ba6-9e3b-67270b1b2951/volumes" Mar 08 03:36:50.139553 master-0 kubenswrapper[13046]: I0308 03:36:50.139518 13046 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4b627a32-e3c8-44aa-8488-90159021cbcf" path="/var/lib/kubelet/pods/4b627a32-e3c8-44aa-8488-90159021cbcf/volumes" Mar 08 03:36:55.436376 master-0 kubenswrapper[13046]: I0308 03:36:55.436301 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"96bb7670-d973-44e2-b9f5-887303acf725","Type":"ContainerStarted","Data":"9423e8e61d68a6132d84d1c69af93ab5aba34cf653abbbd667324c655093d837"} Mar 08 03:36:55.436922 master-0 kubenswrapper[13046]: I0308 03:36:55.436439 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 08 03:36:55.438576 master-0 kubenswrapper[13046]: I0308 03:36:55.438513 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd","Type":"ContainerStarted","Data":"aa635062ff4df3d0fe4d4308c19c4b8eba98ae07ace420b838d830c94f9c66e5"} Mar 08 03:36:55.441425 master-0 kubenswrapper[13046]: I0308 03:36:55.441387 13046 generic.go:334] "Generic (PLEG): container finished" podID="5b658945-9aef-47dc-8600-eb30f696cc3b" containerID="b726f5665c3fa798fb4a53a8cae0cea43b14a16b30cf94f16161d908afcb846a" exitCode=0 Mar 08 03:36:55.441525 master-0 kubenswrapper[13046]: I0308 03:36:55.441450 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5d2gw" event={"ID":"5b658945-9aef-47dc-8600-eb30f696cc3b","Type":"ContainerDied","Data":"b726f5665c3fa798fb4a53a8cae0cea43b14a16b30cf94f16161d908afcb846a"} Mar 08 03:36:55.444099 master-0 kubenswrapper[13046]: I0308 03:36:55.444014 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wzhj" event={"ID":"fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c","Type":"ContainerStarted","Data":"9083c4674eb7fc1a1104e79f02c4fb20e93a9a99f9d7b3d93967f23fed677306"} Mar 08 03:36:55.445089 master-0 kubenswrapper[13046]: I0308 03:36:55.445041 13046 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-controller-4wzhj" Mar 08 03:36:55.448261 master-0 kubenswrapper[13046]: I0308 03:36:55.448210 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1098c02e-9145-47f5-b794-cdc3f015a7b5","Type":"ContainerStarted","Data":"1271d0419b20d691b0fcb890a21a160dda699d6dd9dceff400225dfbf5c3e69a"} Mar 08 03:36:55.451081 master-0 kubenswrapper[13046]: I0308 03:36:55.451031 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3533a834-99ca-4bb9-bc59-2c8eeb11a85e","Type":"ContainerStarted","Data":"e2d620be3dfa1ab0c558aee4a7651db5f784b131862a2ed4fa893c7141bd858b"} Mar 08 03:36:55.452977 master-0 kubenswrapper[13046]: I0308 03:36:55.452933 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c186da86-9a0c-48e2-a06a-babcc5d9e02c","Type":"ContainerStarted","Data":"a74345ccd756d7e6da24b7a3e79487d97c8f7a98a9740f102c4fe34ffb74032a"} Mar 08 03:36:55.478584 master-0 kubenswrapper[13046]: I0308 03:36:55.476217 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.211136608 podStartE2EDuration="25.476192348s" podCreationTimestamp="2026-03-08 03:36:30 +0000 UTC" firstStartedPulling="2026-03-08 03:36:47.274875177 +0000 UTC m=+1409.353642404" lastFinishedPulling="2026-03-08 03:36:54.539930887 +0000 UTC m=+1416.618698144" observedRunningTime="2026-03-08 03:36:55.469994892 +0000 UTC m=+1417.548762129" watchObservedRunningTime="2026-03-08 03:36:55.476192348 +0000 UTC m=+1417.554959575" Mar 08 03:36:55.510319 master-0 kubenswrapper[13046]: I0308 03:36:55.510199 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4wzhj" podStartSLOduration=11.198388106 podStartE2EDuration="18.510170611s" podCreationTimestamp="2026-03-08 03:36:37 +0000 UTC" firstStartedPulling="2026-03-08 
03:36:47.24217705 +0000 UTC m=+1409.320944267" lastFinishedPulling="2026-03-08 03:36:54.553959555 +0000 UTC m=+1416.632726772" observedRunningTime="2026-03-08 03:36:55.496764281 +0000 UTC m=+1417.575531518" watchObservedRunningTime="2026-03-08 03:36:55.510170611 +0000 UTC m=+1417.588937838" Mar 08 03:36:56.259466 master-0 kubenswrapper[13046]: I0308 03:36:56.259407 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:56.485564 master-0 kubenswrapper[13046]: I0308 03:36:56.484589 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5d2gw" event={"ID":"5b658945-9aef-47dc-8600-eb30f696cc3b","Type":"ContainerStarted","Data":"cd08efde83f0424b0a40da6e182b784400e4bd5a39b6d13191bc31ddced09c8c"} Mar 08 03:36:56.485564 master-0 kubenswrapper[13046]: I0308 03:36:56.484635 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5d2gw" event={"ID":"5b658945-9aef-47dc-8600-eb30f696cc3b","Type":"ContainerStarted","Data":"e7f1005ce14e13e69219266da15662b62bf1b6a27d554abde28ebbd645ee5257"} Mar 08 03:36:56.486927 master-0 kubenswrapper[13046]: I0308 03:36:56.485911 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:36:56.486927 master-0 kubenswrapper[13046]: I0308 03:36:56.485938 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:36:56.582453 master-0 kubenswrapper[13046]: I0308 03:36:56.582364 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5d2gw" podStartSLOduration=13.077074261 podStartE2EDuration="19.582343924s" podCreationTimestamp="2026-03-08 03:36:37 +0000 UTC" firstStartedPulling="2026-03-08 03:36:48.048418374 +0000 UTC m=+1410.127185581" lastFinishedPulling="2026-03-08 03:36:54.553687987 +0000 UTC m=+1416.632455244" 
observedRunningTime="2026-03-08 03:36:56.561085921 +0000 UTC m=+1418.639853148" watchObservedRunningTime="2026-03-08 03:36:56.582343924 +0000 UTC m=+1418.661111141" Mar 08 03:36:56.673145 master-0 kubenswrapper[13046]: I0308 03:36:56.672758 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:36:56.901364 master-0 kubenswrapper[13046]: I0308 03:36:56.899139 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7466868675-jbnsb"] Mar 08 03:36:56.901364 master-0 kubenswrapper[13046]: I0308 03:36:56.899391 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7466868675-jbnsb" podUID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerName="dnsmasq-dns" containerID="cri-o://a9b45b7396059ecce416b5911669dc367a4cf158b822a9e12ed734a703811004" gracePeriod=10 Mar 08 03:36:58.510297 master-0 kubenswrapper[13046]: I0308 03:36:58.510212 13046 generic.go:334] "Generic (PLEG): container finished" podID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerID="a9b45b7396059ecce416b5911669dc367a4cf158b822a9e12ed734a703811004" exitCode=0 Mar 08 03:36:58.510982 master-0 kubenswrapper[13046]: I0308 03:36:58.510398 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-jbnsb" event={"ID":"b6a524a5-622c-4556-a7d7-de89bb944cd6","Type":"ContainerDied","Data":"a9b45b7396059ecce416b5911669dc367a4cf158b822a9e12ed734a703811004"} Mar 08 03:36:59.295137 master-0 kubenswrapper[13046]: I0308 03:36:59.294597 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:59.420108 master-0 kubenswrapper[13046]: I0308 03:36:59.420055 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-dns-svc\") pod \"b6a524a5-622c-4556-a7d7-de89bb944cd6\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " Mar 08 03:36:59.420314 master-0 kubenswrapper[13046]: I0308 03:36:59.420143 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-config\") pod \"b6a524a5-622c-4556-a7d7-de89bb944cd6\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " Mar 08 03:36:59.420314 master-0 kubenswrapper[13046]: I0308 03:36:59.420195 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxcbs\" (UniqueName: \"kubernetes.io/projected/b6a524a5-622c-4556-a7d7-de89bb944cd6-kube-api-access-hxcbs\") pod \"b6a524a5-622c-4556-a7d7-de89bb944cd6\" (UID: \"b6a524a5-622c-4556-a7d7-de89bb944cd6\") " Mar 08 03:36:59.441989 master-0 kubenswrapper[13046]: I0308 03:36:59.441930 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a524a5-622c-4556-a7d7-de89bb944cd6-kube-api-access-hxcbs" (OuterVolumeSpecName: "kube-api-access-hxcbs") pod "b6a524a5-622c-4556-a7d7-de89bb944cd6" (UID: "b6a524a5-622c-4556-a7d7-de89bb944cd6"). InnerVolumeSpecName "kube-api-access-hxcbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:36:59.492613 master-0 kubenswrapper[13046]: I0308 03:36:59.490016 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6a524a5-622c-4556-a7d7-de89bb944cd6" (UID: "b6a524a5-622c-4556-a7d7-de89bb944cd6"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:36:59.492613 master-0 kubenswrapper[13046]: I0308 03:36:59.490030 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-config" (OuterVolumeSpecName: "config") pod "b6a524a5-622c-4556-a7d7-de89bb944cd6" (UID: "b6a524a5-622c-4556-a7d7-de89bb944cd6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:36:59.524039 master-0 kubenswrapper[13046]: I0308 03:36:59.522434 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:59.524039 master-0 kubenswrapper[13046]: I0308 03:36:59.522496 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a524a5-622c-4556-a7d7-de89bb944cd6-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:59.524039 master-0 kubenswrapper[13046]: I0308 03:36:59.522513 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxcbs\" (UniqueName: \"kubernetes.io/projected/b6a524a5-622c-4556-a7d7-de89bb944cd6-kube-api-access-hxcbs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:36:59.548601 master-0 kubenswrapper[13046]: I0308 03:36:59.548531 13046 generic.go:334] "Generic (PLEG): container finished" podID="aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd" containerID="aa635062ff4df3d0fe4d4308c19c4b8eba98ae07ace420b838d830c94f9c66e5" exitCode=0 Mar 08 03:36:59.548805 master-0 kubenswrapper[13046]: I0308 03:36:59.548619 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd","Type":"ContainerDied","Data":"aa635062ff4df3d0fe4d4308c19c4b8eba98ae07ace420b838d830c94f9c66e5"} Mar 08 03:36:59.557174 master-0 
kubenswrapper[13046]: I0308 03:36:59.557136 13046 generic.go:334] "Generic (PLEG): container finished" podID="1098c02e-9145-47f5-b794-cdc3f015a7b5" containerID="1271d0419b20d691b0fcb890a21a160dda699d6dd9dceff400225dfbf5c3e69a" exitCode=0 Mar 08 03:36:59.557322 master-0 kubenswrapper[13046]: I0308 03:36:59.557192 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1098c02e-9145-47f5-b794-cdc3f015a7b5","Type":"ContainerDied","Data":"1271d0419b20d691b0fcb890a21a160dda699d6dd9dceff400225dfbf5c3e69a"} Mar 08 03:36:59.568630 master-0 kubenswrapper[13046]: I0308 03:36:59.567415 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-jbnsb" event={"ID":"b6a524a5-622c-4556-a7d7-de89bb944cd6","Type":"ContainerDied","Data":"cb68db2c2d8e0b5b38c4c5a1bfbbf0bc6e4029863405919fcdbe3e5f9c98103f"} Mar 08 03:36:59.568630 master-0 kubenswrapper[13046]: I0308 03:36:59.567461 13046 scope.go:117] "RemoveContainer" containerID="a9b45b7396059ecce416b5911669dc367a4cf158b822a9e12ed734a703811004" Mar 08 03:36:59.568630 master-0 kubenswrapper[13046]: I0308 03:36:59.567458 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-jbnsb" Mar 08 03:36:59.594589 master-0 kubenswrapper[13046]: I0308 03:36:59.593964 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3533a834-99ca-4bb9-bc59-2c8eeb11a85e","Type":"ContainerStarted","Data":"c5bfe2569d6dc4ba23ad87c6900bc3df024678dbff099b0bc748e1b2f418f434"} Mar 08 03:36:59.597740 master-0 kubenswrapper[13046]: I0308 03:36:59.596834 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c186da86-9a0c-48e2-a06a-babcc5d9e02c","Type":"ContainerStarted","Data":"afed951d341821dbee9cf21356759f2db70dfca5bf3d875174d8f9472ffc95bf"} Mar 08 03:36:59.661352 master-0 kubenswrapper[13046]: I0308 03:36:59.660759 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.880065812 podStartE2EDuration="23.660738936s" podCreationTimestamp="2026-03-08 03:36:36 +0000 UTC" firstStartedPulling="2026-03-08 03:36:47.24394993 +0000 UTC m=+1409.322717147" lastFinishedPulling="2026-03-08 03:36:59.024623054 +0000 UTC m=+1421.103390271" observedRunningTime="2026-03-08 03:36:59.634567904 +0000 UTC m=+1421.713335131" watchObservedRunningTime="2026-03-08 03:36:59.660738936 +0000 UTC m=+1421.739506153" Mar 08 03:36:59.688297 master-0 kubenswrapper[13046]: I0308 03:36:59.688217 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.424702731 podStartE2EDuration="21.688191324s" podCreationTimestamp="2026-03-08 03:36:38 +0000 UTC" firstStartedPulling="2026-03-08 03:36:47.744943092 +0000 UTC m=+1409.823710309" lastFinishedPulling="2026-03-08 03:36:59.008431685 +0000 UTC m=+1421.087198902" observedRunningTime="2026-03-08 03:36:59.65805417 +0000 UTC m=+1421.736821387" watchObservedRunningTime="2026-03-08 03:36:59.688191324 +0000 UTC m=+1421.766958541" Mar 08 03:36:59.688453 master-0 
kubenswrapper[13046]: I0308 03:36:59.688392 13046 scope.go:117] "RemoveContainer" containerID="1e6b42ee79a24892490cfad73780ec5a80e601c817d29dc2bfa94408478d33eb" Mar 08 03:36:59.713051 master-0 kubenswrapper[13046]: I0308 03:36:59.712980 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7466868675-jbnsb"] Mar 08 03:36:59.725548 master-0 kubenswrapper[13046]: I0308 03:36:59.725467 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7466868675-jbnsb"] Mar 08 03:37:00.133146 master-0 kubenswrapper[13046]: I0308 03:37:00.133065 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a524a5-622c-4556-a7d7-de89bb944cd6" path="/var/lib/kubelet/pods/b6a524a5-622c-4556-a7d7-de89bb944cd6/volumes" Mar 08 03:37:00.616560 master-0 kubenswrapper[13046]: I0308 03:37:00.616418 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd","Type":"ContainerStarted","Data":"aba8a6f9bceb78b9a97a4aaef651275415b6c2cddd115da813291f71f9355e12"} Mar 08 03:37:00.619466 master-0 kubenswrapper[13046]: I0308 03:37:00.619404 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1098c02e-9145-47f5-b794-cdc3f015a7b5","Type":"ContainerStarted","Data":"deabd5320a723cb2b476bf98e4c458081749e7e7b0288c55d398da6d3bd2329b"} Mar 08 03:37:00.657519 master-0 kubenswrapper[13046]: I0308 03:37:00.649940 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 08 03:37:00.658605 master-0 kubenswrapper[13046]: I0308 03:37:00.657803 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.332409209 podStartE2EDuration="33.657775729s" podCreationTimestamp="2026-03-08 03:36:27 +0000 UTC" firstStartedPulling="2026-03-08 03:36:47.275065542 +0000 UTC m=+1409.353832769" 
lastFinishedPulling="2026-03-08 03:36:54.600432062 +0000 UTC m=+1416.679199289" observedRunningTime="2026-03-08 03:37:00.644282807 +0000 UTC m=+1422.723050054" watchObservedRunningTime="2026-03-08 03:37:00.657775729 +0000 UTC m=+1422.736542986" Mar 08 03:37:00.696646 master-0 kubenswrapper[13046]: I0308 03:37:00.695817 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.366194957 podStartE2EDuration="30.695760336s" podCreationTimestamp="2026-03-08 03:36:30 +0000 UTC" firstStartedPulling="2026-03-08 03:36:47.226204827 +0000 UTC m=+1409.304972084" lastFinishedPulling="2026-03-08 03:36:54.555770246 +0000 UTC m=+1416.634537463" observedRunningTime="2026-03-08 03:37:00.690037224 +0000 UTC m=+1422.768804471" watchObservedRunningTime="2026-03-08 03:37:00.695760336 +0000 UTC m=+1422.774527593" Mar 08 03:37:00.998345 master-0 kubenswrapper[13046]: I0308 03:37:00.998137 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 08 03:37:01.072813 master-0 kubenswrapper[13046]: I0308 03:37:01.072727 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 08 03:37:01.489641 master-0 kubenswrapper[13046]: I0308 03:37:01.489565 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 08 03:37:01.637296 master-0 kubenswrapper[13046]: I0308 03:37:01.637222 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 08 03:37:01.649865 master-0 kubenswrapper[13046]: I0308 03:37:01.649781 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 08 03:37:01.683455 master-0 kubenswrapper[13046]: I0308 03:37:01.683359 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 08 
03:37:01.698940 master-0 kubenswrapper[13046]: I0308 03:37:01.698873 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 08 03:37:01.937334 master-0 kubenswrapper[13046]: I0308 03:37:01.936924 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5998b4894c-7wgvq"] Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: E0308 03:37:01.937696 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4183465e-ec48-4ba6-9e3b-67270b1b2951" containerName="init" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.937721 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4183465e-ec48-4ba6-9e3b-67270b1b2951" containerName="init" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: E0308 03:37:01.937772 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerName="dnsmasq-dns" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.937781 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerName="dnsmasq-dns" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: E0308 03:37:01.937798 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerName="init" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.937807 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerName="init" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: E0308 03:37:01.937825 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b627a32-e3c8-44aa-8488-90159021cbcf" containerName="init" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.937832 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b627a32-e3c8-44aa-8488-90159021cbcf" containerName="init" Mar 08 
03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.938043 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b627a32-e3c8-44aa-8488-90159021cbcf" containerName="init" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.938071 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a524a5-622c-4556-a7d7-de89bb944cd6" containerName="dnsmasq-dns" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.938087 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="4183465e-ec48-4ba6-9e3b-67270b1b2951" containerName="init" Mar 08 03:37:01.950030 master-0 kubenswrapper[13046]: I0308 03:37:01.942189 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:01.955629 master-0 kubenswrapper[13046]: I0308 03:37:01.952651 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 08 03:37:01.982726 master-0 kubenswrapper[13046]: I0308 03:37:01.982672 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-config\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:01.982941 master-0 kubenswrapper[13046]: I0308 03:37:01.982739 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-dns-svc\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:01.982941 master-0 kubenswrapper[13046]: I0308 03:37:01.982770 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bt2k4\" (UniqueName: \"kubernetes.io/projected/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-kube-api-access-bt2k4\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:01.982941 master-0 kubenswrapper[13046]: I0308 03:37:01.982901 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-ovsdbserver-sb\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:01.993598 master-0 kubenswrapper[13046]: I0308 03:37:01.993545 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5998b4894c-7wgvq"] Mar 08 03:37:02.030561 master-0 kubenswrapper[13046]: I0308 03:37:02.030131 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-sxqz2"] Mar 08 03:37:02.031529 master-0 kubenswrapper[13046]: I0308 03:37:02.031506 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.033853 master-0 kubenswrapper[13046]: I0308 03:37:02.033830 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 08 03:37:02.038978 master-0 kubenswrapper[13046]: I0308 03:37:02.038931 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-sxqz2"] Mar 08 03:37:02.083993 master-0 kubenswrapper[13046]: I0308 03:37:02.083933 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188fe7d-a015-4d70-b6ab-a001523d4ebd-combined-ca-bundle\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.083993 master-0 kubenswrapper[13046]: I0308 03:37:02.083994 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0188fe7d-a015-4d70-b6ab-a001523d4ebd-config\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084026 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-ovsdbserver-sb\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084047 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188fe7d-a015-4d70-b6ab-a001523d4ebd-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084066 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0188fe7d-a015-4d70-b6ab-a001523d4ebd-ovn-rundir\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084096 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0188fe7d-a015-4d70-b6ab-a001523d4ebd-ovs-rundir\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084158 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-config\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084191 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-dns-svc\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084213 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spbw\" (UniqueName: 
\"kubernetes.io/projected/0188fe7d-a015-4d70-b6ab-a001523d4ebd-kube-api-access-5spbw\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.084257 master-0 kubenswrapper[13046]: I0308 03:37:02.084233 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt2k4\" (UniqueName: \"kubernetes.io/projected/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-kube-api-access-bt2k4\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.087294 master-0 kubenswrapper[13046]: I0308 03:37:02.087251 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-dns-svc\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.088227 master-0 kubenswrapper[13046]: I0308 03:37:02.088187 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-config\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.095357 master-0 kubenswrapper[13046]: I0308 03:37:02.095265 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-ovsdbserver-sb\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.102281 master-0 kubenswrapper[13046]: I0308 03:37:02.102234 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt2k4\" (UniqueName: 
\"kubernetes.io/projected/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-kube-api-access-bt2k4\") pod \"dnsmasq-dns-5998b4894c-7wgvq\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.188190 master-0 kubenswrapper[13046]: I0308 03:37:02.186793 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188fe7d-a015-4d70-b6ab-a001523d4ebd-combined-ca-bundle\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.188190 master-0 kubenswrapper[13046]: I0308 03:37:02.187930 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0188fe7d-a015-4d70-b6ab-a001523d4ebd-config\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.188190 master-0 kubenswrapper[13046]: I0308 03:37:02.187962 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188fe7d-a015-4d70-b6ab-a001523d4ebd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.188190 master-0 kubenswrapper[13046]: I0308 03:37:02.187985 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0188fe7d-a015-4d70-b6ab-a001523d4ebd-ovn-rundir\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.188190 master-0 kubenswrapper[13046]: I0308 03:37:02.188034 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0188fe7d-a015-4d70-b6ab-a001523d4ebd-ovs-rundir\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.188190 master-0 kubenswrapper[13046]: I0308 03:37:02.188178 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spbw\" (UniqueName: \"kubernetes.io/projected/0188fe7d-a015-4d70-b6ab-a001523d4ebd-kube-api-access-5spbw\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.189712 master-0 kubenswrapper[13046]: I0308 03:37:02.189683 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0188fe7d-a015-4d70-b6ab-a001523d4ebd-config\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.190894 master-0 kubenswrapper[13046]: I0308 03:37:02.190852 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188fe7d-a015-4d70-b6ab-a001523d4ebd-combined-ca-bundle\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.190995 master-0 kubenswrapper[13046]: I0308 03:37:02.190949 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0188fe7d-a015-4d70-b6ab-a001523d4ebd-ovn-rundir\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.191425 master-0 kubenswrapper[13046]: I0308 03:37:02.191363 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/0188fe7d-a015-4d70-b6ab-a001523d4ebd-ovs-rundir\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.196147 master-0 kubenswrapper[13046]: I0308 03:37:02.196109 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188fe7d-a015-4d70-b6ab-a001523d4ebd-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.264971 master-0 kubenswrapper[13046]: I0308 03:37:02.264898 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:02.313072 master-0 kubenswrapper[13046]: I0308 03:37:02.313012 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spbw\" (UniqueName: \"kubernetes.io/projected/0188fe7d-a015-4d70-b6ab-a001523d4ebd-kube-api-access-5spbw\") pod \"ovn-controller-metrics-sxqz2\" (UID: \"0188fe7d-a015-4d70-b6ab-a001523d4ebd\") " pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.363567 master-0 kubenswrapper[13046]: I0308 03:37:02.360769 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-sxqz2" Mar 08 03:37:02.712181 master-0 kubenswrapper[13046]: I0308 03:37:02.712122 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 08 03:37:03.020076 master-0 kubenswrapper[13046]: I0308 03:37:03.020008 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5998b4894c-7wgvq"] Mar 08 03:37:03.074784 master-0 kubenswrapper[13046]: W0308 03:37:03.074728 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0188fe7d_a015_4d70_b6ab_a001523d4ebd.slice/crio-a8d9285a5ba93152c65d54e1e0a904be2f670e83b3ad00ffd743d9a8825819c0 WatchSource:0}: Error finding container a8d9285a5ba93152c65d54e1e0a904be2f670e83b3ad00ffd743d9a8825819c0: Status 404 returned error can't find the container with id a8d9285a5ba93152c65d54e1e0a904be2f670e83b3ad00ffd743d9a8825819c0 Mar 08 03:37:03.085125 master-0 kubenswrapper[13046]: I0308 03:37:03.085071 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-sxqz2"] Mar 08 03:37:03.117564 master-0 kubenswrapper[13046]: I0308 03:37:03.113338 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-qp5cc"] Mar 08 03:37:03.117564 master-0 kubenswrapper[13046]: I0308 03:37:03.115097 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.123562 master-0 kubenswrapper[13046]: I0308 03:37:03.123526 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 08 03:37:03.126267 master-0 kubenswrapper[13046]: W0308 03:37:03.126191 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b61e2e_440c_4dff_a101_5ec0ad3d8c89.slice/crio-4d3232f04ad0442ad3a02cc9ad74ba6fc06d0a0cd05a267a7d863a00d2bd7d35 WatchSource:0}: Error finding container 4d3232f04ad0442ad3a02cc9ad74ba6fc06d0a0cd05a267a7d863a00d2bd7d35: Status 404 returned error can't find the container with id 4d3232f04ad0442ad3a02cc9ad74ba6fc06d0a0cd05a267a7d863a00d2bd7d35 Mar 08 03:37:03.129687 master-0 kubenswrapper[13046]: I0308 03:37:03.129632 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5998b4894c-7wgvq"] Mar 08 03:37:03.217538 master-0 kubenswrapper[13046]: I0308 03:37:03.213100 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.217538 master-0 kubenswrapper[13046]: I0308 03:37:03.213182 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.217538 master-0 kubenswrapper[13046]: I0308 03:37:03.213215 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-dns-svc\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.217538 master-0 kubenswrapper[13046]: I0308 03:37:03.213346 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-config\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.217538 master-0 kubenswrapper[13046]: I0308 03:37:03.213402 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxv64\" (UniqueName: \"kubernetes.io/projected/ada12bf8-e0a1-46dd-9a45-876157b33dd6-kube-api-access-jxv64\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.217538 master-0 kubenswrapper[13046]: I0308 03:37:03.215898 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-qp5cc"] Mar 08 03:37:03.315169 master-0 kubenswrapper[13046]: I0308 03:37:03.315113 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.315169 master-0 kubenswrapper[13046]: I0308 03:37:03.315180 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-dns-svc\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " 
pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.315520 master-0 kubenswrapper[13046]: I0308 03:37:03.315246 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-config\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.315698 master-0 kubenswrapper[13046]: I0308 03:37:03.315671 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxv64\" (UniqueName: \"kubernetes.io/projected/ada12bf8-e0a1-46dd-9a45-876157b33dd6-kube-api-access-jxv64\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.315782 master-0 kubenswrapper[13046]: I0308 03:37:03.315761 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.316239 master-0 kubenswrapper[13046]: I0308 03:37:03.316188 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.316282 master-0 kubenswrapper[13046]: I0308 03:37:03.316216 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-dns-svc\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " 
pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.316579 master-0 kubenswrapper[13046]: I0308 03:37:03.316556 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.316795 master-0 kubenswrapper[13046]: I0308 03:37:03.316763 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-config\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.426596 master-0 kubenswrapper[13046]: I0308 03:37:03.426081 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxv64\" (UniqueName: \"kubernetes.io/projected/ada12bf8-e0a1-46dd-9a45-876157b33dd6-kube-api-access-jxv64\") pod \"dnsmasq-dns-58dc6c9559-qp5cc\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") " pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.649236 master-0 kubenswrapper[13046]: I0308 03:37:03.649175 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" Mar 08 03:37:03.687567 master-0 kubenswrapper[13046]: I0308 03:37:03.687503 13046 generic.go:334] "Generic (PLEG): container finished" podID="f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" containerID="b4a137ab246ed049c45cbf5c2a01b157055bb8abc5383e86966804bd1d8c345a" exitCode=0 Mar 08 03:37:03.687649 master-0 kubenswrapper[13046]: I0308 03:37:03.687599 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" event={"ID":"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89","Type":"ContainerDied","Data":"b4a137ab246ed049c45cbf5c2a01b157055bb8abc5383e86966804bd1d8c345a"} Mar 08 03:37:03.687649 master-0 kubenswrapper[13046]: I0308 03:37:03.687627 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" event={"ID":"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89","Type":"ContainerStarted","Data":"4d3232f04ad0442ad3a02cc9ad74ba6fc06d0a0cd05a267a7d863a00d2bd7d35"} Mar 08 03:37:03.695863 master-0 kubenswrapper[13046]: I0308 03:37:03.695820 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sxqz2" event={"ID":"0188fe7d-a015-4d70-b6ab-a001523d4ebd","Type":"ContainerStarted","Data":"add75319018398c753595a08602b9420bb17f6f93ca7d1f5b746e766ca37590b"} Mar 08 03:37:03.695946 master-0 kubenswrapper[13046]: I0308 03:37:03.695874 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sxqz2" event={"ID":"0188fe7d-a015-4d70-b6ab-a001523d4ebd","Type":"ContainerStarted","Data":"a8d9285a5ba93152c65d54e1e0a904be2f670e83b3ad00ffd743d9a8825819c0"} Mar 08 03:37:04.099534 master-0 kubenswrapper[13046]: I0308 03:37:04.097123 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 08 03:37:04.103336 master-0 kubenswrapper[13046]: I0308 03:37:04.102311 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 03:37:04.108037 master-0 kubenswrapper[13046]: I0308 03:37:04.107031 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 08 03:37:04.108037 master-0 kubenswrapper[13046]: I0308 03:37:04.107078 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 08 03:37:04.110498 master-0 kubenswrapper[13046]: I0308 03:37:04.109331 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 08 03:37:04.228511 master-0 kubenswrapper[13046]: I0308 03:37:04.202610 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 03:37:04.228511 master-0 kubenswrapper[13046]: I0308 03:37:04.216241 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-sxqz2" podStartSLOduration=3.216219299 podStartE2EDuration="3.216219299s" podCreationTimestamp="2026-03-08 03:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:04.177854912 +0000 UTC m=+1426.256622129" watchObservedRunningTime="2026-03-08 03:37:04.216219299 +0000 UTC m=+1426.294986516" Mar 08 03:37:04.235706 master-0 kubenswrapper[13046]: I0308 03:37:04.234075 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:04.347508 master-0 kubenswrapper[13046]: I0308 03:37:04.347087 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-ovsdbserver-sb\") pod \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " Mar 08 03:37:04.347705 master-0 kubenswrapper[13046]: I0308 03:37:04.347666 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-config\") pod \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " Mar 08 03:37:04.347745 master-0 kubenswrapper[13046]: I0308 03:37:04.347726 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt2k4\" (UniqueName: \"kubernetes.io/projected/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-kube-api-access-bt2k4\") pod \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " Mar 08 03:37:04.347786 master-0 kubenswrapper[13046]: I0308 03:37:04.347751 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-dns-svc\") pod \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\" (UID: \"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89\") " Mar 08 03:37:04.348190 master-0 kubenswrapper[13046]: I0308 03:37:04.348168 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47992c73-c637-40e5-955f-9738ece43dc5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.348242 master-0 kubenswrapper[13046]: I0308 03:37:04.348222 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47992c73-c637-40e5-955f-9738ece43dc5-config\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.348285 master-0 kubenswrapper[13046]: I0308 03:37:04.348266 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.348342 master-0 kubenswrapper[13046]: I0308 03:37:04.348325 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47992c73-c637-40e5-955f-9738ece43dc5-scripts\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.348412 master-0 kubenswrapper[13046]: I0308 03:37:04.348398 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5pm\" (UniqueName: \"kubernetes.io/projected/47992c73-c637-40e5-955f-9738ece43dc5-kube-api-access-rs5pm\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.348704 master-0 kubenswrapper[13046]: I0308 03:37:04.348658 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.348755 master-0 kubenswrapper[13046]: I0308 03:37:04.348739 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.358599 master-0 kubenswrapper[13046]: W0308 03:37:04.356394 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podada12bf8_e0a1_46dd_9a45_876157b33dd6.slice/crio-11f236aec72b8ea3e67cc12f65ec1a75f24fef77706f381f990adc7cf964b6c4 WatchSource:0}: Error finding container 11f236aec72b8ea3e67cc12f65ec1a75f24fef77706f381f990adc7cf964b6c4: Status 404 returned error can't find the container with id 11f236aec72b8ea3e67cc12f65ec1a75f24fef77706f381f990adc7cf964b6c4 Mar 08 03:37:04.360230 master-0 kubenswrapper[13046]: I0308 03:37:04.360189 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-kube-api-access-bt2k4" (OuterVolumeSpecName: "kube-api-access-bt2k4") pod "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" (UID: "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89"). InnerVolumeSpecName "kube-api-access-bt2k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:04.373381 master-0 kubenswrapper[13046]: I0308 03:37:04.372794 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-qp5cc"] Mar 08 03:37:04.384304 master-0 kubenswrapper[13046]: I0308 03:37:04.384164 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" (UID: "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:04.393154 master-0 kubenswrapper[13046]: I0308 03:37:04.393098 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-config" (OuterVolumeSpecName: "config") pod "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" (UID: "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:04.404323 master-0 kubenswrapper[13046]: I0308 03:37:04.404278 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" (UID: "f8b61e2e-440c-4dff-a101-5ec0ad3d8c89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:04.450690 master-0 kubenswrapper[13046]: I0308 03:37:04.450588 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47992c73-c637-40e5-955f-9738ece43dc5-config\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.451125 master-0 kubenswrapper[13046]: I0308 03:37:04.450996 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.451201 master-0 kubenswrapper[13046]: I0308 03:37:04.451147 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47992c73-c637-40e5-955f-9738ece43dc5-scripts\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " 
pod="openstack/ovn-northd-0" Mar 08 03:37:04.451414 master-0 kubenswrapper[13046]: I0308 03:37:04.451272 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs5pm\" (UniqueName: \"kubernetes.io/projected/47992c73-c637-40e5-955f-9738ece43dc5-kube-api-access-rs5pm\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.451672 master-0 kubenswrapper[13046]: I0308 03:37:04.451622 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47992c73-c637-40e5-955f-9738ece43dc5-config\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.453415 master-0 kubenswrapper[13046]: I0308 03:37:04.453210 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.453415 master-0 kubenswrapper[13046]: I0308 03:37:04.453300 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.453575 master-0 kubenswrapper[13046]: I0308 03:37:04.453451 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47992c73-c637-40e5-955f-9738ece43dc5-scripts\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.453575 master-0 kubenswrapper[13046]: I0308 03:37:04.453532 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47992c73-c637-40e5-955f-9738ece43dc5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.453667 master-0 kubenswrapper[13046]: I0308 03:37:04.453625 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:04.453667 master-0 kubenswrapper[13046]: I0308 03:37:04.453639 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:04.453667 master-0 kubenswrapper[13046]: I0308 03:37:04.453649 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt2k4\" (UniqueName: \"kubernetes.io/projected/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-kube-api-access-bt2k4\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:04.453667 master-0 kubenswrapper[13046]: I0308 03:37:04.453661 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:04.454117 master-0 kubenswrapper[13046]: I0308 03:37:04.454035 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47992c73-c637-40e5-955f-9738ece43dc5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.456946 master-0 kubenswrapper[13046]: I0308 03:37:04.456891 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.458289 master-0 kubenswrapper[13046]: I0308 03:37:04.458257 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.459355 master-0 kubenswrapper[13046]: I0308 03:37:04.459256 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/47992c73-c637-40e5-955f-9738ece43dc5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.708100 master-0 kubenswrapper[13046]: I0308 03:37:04.708036 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" event={"ID":"f8b61e2e-440c-4dff-a101-5ec0ad3d8c89","Type":"ContainerDied","Data":"4d3232f04ad0442ad3a02cc9ad74ba6fc06d0a0cd05a267a7d863a00d2bd7d35"} Mar 08 03:37:04.708280 master-0 kubenswrapper[13046]: I0308 03:37:04.708115 13046 scope.go:117] "RemoveContainer" containerID="b4a137ab246ed049c45cbf5c2a01b157055bb8abc5383e86966804bd1d8c345a" Mar 08 03:37:04.708280 master-0 kubenswrapper[13046]: I0308 03:37:04.708241 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5998b4894c-7wgvq" Mar 08 03:37:04.715870 master-0 kubenswrapper[13046]: I0308 03:37:04.715806 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" event={"ID":"ada12bf8-e0a1-46dd-9a45-876157b33dd6","Type":"ContainerStarted","Data":"7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3"} Mar 08 03:37:04.716034 master-0 kubenswrapper[13046]: I0308 03:37:04.715883 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" event={"ID":"ada12bf8-e0a1-46dd-9a45-876157b33dd6","Type":"ContainerStarted","Data":"11f236aec72b8ea3e67cc12f65ec1a75f24fef77706f381f990adc7cf964b6c4"} Mar 08 03:37:04.792009 master-0 kubenswrapper[13046]: I0308 03:37:04.791950 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5pm\" (UniqueName: \"kubernetes.io/projected/47992c73-c637-40e5-955f-9738ece43dc5-kube-api-access-rs5pm\") pod \"ovn-northd-0\" (UID: \"47992c73-c637-40e5-955f-9738ece43dc5\") " pod="openstack/ovn-northd-0" Mar 08 03:37:04.822597 master-0 kubenswrapper[13046]: I0308 03:37:04.821975 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 03:37:04.860526 master-0 kubenswrapper[13046]: I0308 03:37:04.860425 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 08 03:37:04.860526 master-0 kubenswrapper[13046]: I0308 03:37:04.860478 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 08 03:37:05.388587 master-0 kubenswrapper[13046]: I0308 03:37:05.387788 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-qp5cc"] Mar 08 03:37:05.407590 master-0 kubenswrapper[13046]: I0308 03:37:05.407540 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 03:37:05.427864 master-0 kubenswrapper[13046]: I0308 03:37:05.427809 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5998b4894c-7wgvq"] Mar 08 03:37:05.547844 master-0 kubenswrapper[13046]: I0308 03:37:05.546692 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5998b4894c-7wgvq"] Mar 08 03:37:05.577523 master-0 kubenswrapper[13046]: I0308 03:37:05.576923 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-66sbz"] Mar 08 03:37:05.577747 master-0 kubenswrapper[13046]: E0308 03:37:05.577538 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" containerName="init" Mar 08 03:37:05.577747 master-0 kubenswrapper[13046]: I0308 03:37:05.577559 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" containerName="init" Mar 08 03:37:05.577863 master-0 kubenswrapper[13046]: I0308 03:37:05.577796 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" containerName="init" Mar 08 03:37:05.581600 master-0 kubenswrapper[13046]: I0308 03:37:05.579176 13046 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.642558 master-0 kubenswrapper[13046]: I0308 03:37:05.635756 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-66sbz"] Mar 08 03:37:05.692034 master-0 kubenswrapper[13046]: I0308 03:37:05.690791 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-sb\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.692034 master-0 kubenswrapper[13046]: I0308 03:37:05.690903 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-config\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.692034 master-0 kubenswrapper[13046]: I0308 03:37:05.690942 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqqt7\" (UniqueName: \"kubernetes.io/projected/e26fb299-b4b3-4f84-acb9-82afd62a9c39-kube-api-access-bqqt7\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.692034 master-0 kubenswrapper[13046]: I0308 03:37:05.691023 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-dns-svc\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.692034 master-0 kubenswrapper[13046]: I0308 
03:37:05.691064 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-nb\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.738793 master-0 kubenswrapper[13046]: I0308 03:37:05.738658 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"47992c73-c637-40e5-955f-9738ece43dc5","Type":"ContainerStarted","Data":"c3fad6376e2d161eb847cb5a40bfe23e7e3c8a9677c6388ea920c59b6bb09fd2"} Mar 08 03:37:05.740224 master-0 kubenswrapper[13046]: I0308 03:37:05.740186 13046 generic.go:334] "Generic (PLEG): container finished" podID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerID="7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3" exitCode=0 Mar 08 03:37:05.740224 master-0 kubenswrapper[13046]: I0308 03:37:05.740215 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" event={"ID":"ada12bf8-e0a1-46dd-9a45-876157b33dd6","Type":"ContainerDied","Data":"7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3"} Mar 08 03:37:05.792399 master-0 kubenswrapper[13046]: I0308 03:37:05.792352 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-config\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.792576 master-0 kubenswrapper[13046]: I0308 03:37:05.792422 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqqt7\" (UniqueName: \"kubernetes.io/projected/e26fb299-b4b3-4f84-acb9-82afd62a9c39-kube-api-access-bqqt7\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: 
\"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.792576 master-0 kubenswrapper[13046]: I0308 03:37:05.792529 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-dns-svc\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.792668 master-0 kubenswrapper[13046]: I0308 03:37:05.792575 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-nb\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.792668 master-0 kubenswrapper[13046]: I0308 03:37:05.792605 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-sb\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.793385 master-0 kubenswrapper[13046]: I0308 03:37:05.793364 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-sb\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.793931 master-0 kubenswrapper[13046]: I0308 03:37:05.793905 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-config\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: 
\"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.795588 master-0 kubenswrapper[13046]: I0308 03:37:05.795552 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-dns-svc\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.795761 master-0 kubenswrapper[13046]: I0308 03:37:05.795720 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-nb\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.807768 master-0 kubenswrapper[13046]: I0308 03:37:05.807733 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqqt7\" (UniqueName: \"kubernetes.io/projected/e26fb299-b4b3-4f84-acb9-82afd62a9c39-kube-api-access-bqqt7\") pod \"dnsmasq-dns-d6c6c44c5-66sbz\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:05.875321 master-0 kubenswrapper[13046]: I0308 03:37:05.875259 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 08 03:37:05.875321 master-0 kubenswrapper[13046]: I0308 03:37:05.875310 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 08 03:37:05.906795 master-0 kubenswrapper[13046]: I0308 03:37:05.906700 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz"
Mar 08 03:37:06.175561 master-0 kubenswrapper[13046]: I0308 03:37:06.163940 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b61e2e-440c-4dff-a101-5ec0ad3d8c89" path="/var/lib/kubelet/pods/f8b61e2e-440c-4dff-a101-5ec0ad3d8c89/volumes"
Mar 08 03:37:06.296354 master-0 kubenswrapper[13046]: I0308 03:37:06.292380 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 08 03:37:06.676088 master-0 kubenswrapper[13046]: I0308 03:37:06.676018 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-66sbz"]
Mar 08 03:37:06.758709 master-0 kubenswrapper[13046]: I0308 03:37:06.757806 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" event={"ID":"ada12bf8-e0a1-46dd-9a45-876157b33dd6","Type":"ContainerStarted","Data":"780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60"}
Mar 08 03:37:06.758709 master-0 kubenswrapper[13046]: I0308 03:37:06.758283 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" podUID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerName="dnsmasq-dns" containerID="cri-o://780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60" gracePeriod=10
Mar 08 03:37:06.798932 master-0 kubenswrapper[13046]: I0308 03:37:06.798860 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" podStartSLOduration=4.798839188 podStartE2EDuration="4.798839188s" podCreationTimestamp="2026-03-08 03:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:06.788387592 +0000 UTC m=+1428.867154799" watchObservedRunningTime="2026-03-08 03:37:06.798839188 +0000 UTC m=+1428.877606405"
Mar 08 03:37:06.801643 master-0 kubenswrapper[13046]: I0308 03:37:06.801575 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 08 03:37:06.819131 master-0 kubenswrapper[13046]: I0308 03:37:06.819071 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 08 03:37:06.822844 master-0 kubenswrapper[13046]: I0308 03:37:06.822799 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 08 03:37:06.823091 master-0 kubenswrapper[13046]: I0308 03:37:06.823048 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 08 03:37:06.823438 master-0 kubenswrapper[13046]: I0308 03:37:06.823396 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 08 03:37:06.827279 master-0 kubenswrapper[13046]: I0308 03:37:06.827198 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 08 03:37:06.863330 master-0 kubenswrapper[13046]: I0308 03:37:06.862904 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 08 03:37:06.946237 master-0 kubenswrapper[13046]: I0308 03:37:06.946192 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:06.946724 master-0 kubenswrapper[13046]: I0308 03:37:06.946692 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9dc808e4-3438-43cc-ae1d-5305ced8e3b9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0893f9fd-26b1-4ab8-ab8b-5d1166264044\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:06.946835 master-0 kubenswrapper[13046]: I0308 03:37:06.946819 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:06.946955 master-0 kubenswrapper[13046]: I0308 03:37:06.946942 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-cache\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:06.947074 master-0 kubenswrapper[13046]: I0308 03:37:06.947060 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-lock\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:06.947161 master-0 kubenswrapper[13046]: I0308 03:37:06.947147 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc9cr\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-kube-api-access-lc9cr\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.035354 master-0 kubenswrapper[13046]: W0308 03:37:07.035288 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode26fb299_b4b3_4f84_acb9_82afd62a9c39.slice/crio-b2569c51cfa7ef60b635cd8aec00a49b1f725894db0715619719c044df012241 WatchSource:0}: Error finding container b2569c51cfa7ef60b635cd8aec00a49b1f725894db0715619719c044df012241: Status 404 returned error can't find the container with id b2569c51cfa7ef60b635cd8aec00a49b1f725894db0715619719c044df012241
Mar 08 03:37:07.049106 master-0 kubenswrapper[13046]: I0308 03:37:07.049059 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-lock\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.049229 master-0 kubenswrapper[13046]: I0308 03:37:07.049134 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc9cr\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-kube-api-access-lc9cr\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.049229 master-0 kubenswrapper[13046]: I0308 03:37:07.049201 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.049359 master-0 kubenswrapper[13046]: I0308 03:37:07.049336 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9dc808e4-3438-43cc-ae1d-5305ced8e3b9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0893f9fd-26b1-4ab8-ab8b-5d1166264044\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.049428 master-0 kubenswrapper[13046]: I0308 03:37:07.049375 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.049475 master-0 kubenswrapper[13046]: I0308 03:37:07.049429 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-cache\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.050002 master-0 kubenswrapper[13046]: I0308 03:37:07.049977 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-cache\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.050301 master-0 kubenswrapper[13046]: I0308 03:37:07.050277 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-lock\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.050705 master-0 kubenswrapper[13046]: E0308 03:37:07.050681 13046 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 03:37:07.050705 master-0 kubenswrapper[13046]: E0308 03:37:07.050706 13046 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 03:37:07.051098 master-0 kubenswrapper[13046]: E0308 03:37:07.050750 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift podName:bdcf318b-3d3e-42da-ae4b-39c6a17f8437 nodeName:}" failed. No retries permitted until 2026-03-08 03:37:07.550732858 +0000 UTC m=+1429.629500075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift") pod "swift-storage-0" (UID: "bdcf318b-3d3e-42da-ae4b-39c6a17f8437") : configmap "swift-ring-files" not found
Mar 08 03:37:07.053518 master-0 kubenswrapper[13046]: I0308 03:37:07.052962 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 03:37:07.053518 master-0 kubenswrapper[13046]: I0308 03:37:07.053013 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9dc808e4-3438-43cc-ae1d-5305ced8e3b9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0893f9fd-26b1-4ab8-ab8b-5d1166264044\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4b7bd17cdfc7d007f512615258a3556af41a30bd47ac2286eda7aaa1131c14e8/globalmount\"" pod="openstack/swift-storage-0"
Mar 08 03:37:07.054471 master-0 kubenswrapper[13046]: I0308 03:37:07.054405 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.075769 master-0 kubenswrapper[13046]: I0308 03:37:07.075705 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc9cr\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-kube-api-access-lc9cr\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.548327 master-0 kubenswrapper[13046]: I0308 03:37:07.547894 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc"
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: I0308 03:37:07.590698 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-dns-svc\") pod \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") "
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: I0308 03:37:07.590795 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-config\") pod \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") "
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: I0308 03:37:07.590845 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-nb\") pod \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") "
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: I0308 03:37:07.590864 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-sb\") pod \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") "
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: I0308 03:37:07.590899 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxv64\" (UniqueName: \"kubernetes.io/projected/ada12bf8-e0a1-46dd-9a45-876157b33dd6-kube-api-access-jxv64\") pod \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\" (UID: \"ada12bf8-e0a1-46dd-9a45-876157b33dd6\") "
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: I0308 03:37:07.591107 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: E0308 03:37:07.591354 13046 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: E0308 03:37:07.591368 13046 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 03:37:07.616551 master-0 kubenswrapper[13046]: E0308 03:37:07.591410 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift podName:bdcf318b-3d3e-42da-ae4b-39c6a17f8437 nodeName:}" failed. No retries permitted until 2026-03-08 03:37:08.591396665 +0000 UTC m=+1430.670163882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift") pod "swift-storage-0" (UID: "bdcf318b-3d3e-42da-ae4b-39c6a17f8437") : configmap "swift-ring-files" not found
Mar 08 03:37:07.637855 master-0 kubenswrapper[13046]: I0308 03:37:07.637801 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada12bf8-e0a1-46dd-9a45-876157b33dd6-kube-api-access-jxv64" (OuterVolumeSpecName: "kube-api-access-jxv64") pod "ada12bf8-e0a1-46dd-9a45-876157b33dd6" (UID: "ada12bf8-e0a1-46dd-9a45-876157b33dd6"). InnerVolumeSpecName "kube-api-access-jxv64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:07.693464 master-0 kubenswrapper[13046]: I0308 03:37:07.693364 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxv64\" (UniqueName: \"kubernetes.io/projected/ada12bf8-e0a1-46dd-9a45-876157b33dd6-kube-api-access-jxv64\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:07.693464 master-0 kubenswrapper[13046]: I0308 03:37:07.693392 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ada12bf8-e0a1-46dd-9a45-876157b33dd6" (UID: "ada12bf8-e0a1-46dd-9a45-876157b33dd6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:07.711537 master-0 kubenswrapper[13046]: I0308 03:37:07.711373 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ada12bf8-e0a1-46dd-9a45-876157b33dd6" (UID: "ada12bf8-e0a1-46dd-9a45-876157b33dd6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:07.718679 master-0 kubenswrapper[13046]: I0308 03:37:07.718623 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ada12bf8-e0a1-46dd-9a45-876157b33dd6" (UID: "ada12bf8-e0a1-46dd-9a45-876157b33dd6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:07.733373 master-0 kubenswrapper[13046]: I0308 03:37:07.733320 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 08 03:37:07.744843 master-0 kubenswrapper[13046]: I0308 03:37:07.744746 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-config" (OuterVolumeSpecName: "config") pod "ada12bf8-e0a1-46dd-9a45-876157b33dd6" (UID: "ada12bf8-e0a1-46dd-9a45-876157b33dd6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:07.780625 master-0 kubenswrapper[13046]: I0308 03:37:07.777789 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"47992c73-c637-40e5-955f-9738ece43dc5","Type":"ContainerStarted","Data":"94ded341ba54acdbfc4061099c0ba43a960169a1a0e9bbeb4ad294707b546744"}
Mar 08 03:37:07.780625 master-0 kubenswrapper[13046]: I0308 03:37:07.777863 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"47992c73-c637-40e5-955f-9738ece43dc5","Type":"ContainerStarted","Data":"0ff4b3fea5609f7045640afd02f6ea3e0ce4f31eb14fd4c21c5f97cb4b75c686"}
Mar 08 03:37:07.780625 master-0 kubenswrapper[13046]: I0308 03:37:07.779385 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 08 03:37:07.791549 master-0 kubenswrapper[13046]: I0308 03:37:07.790624 13046 generic.go:334] "Generic (PLEG): container finished" podID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerID="780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60" exitCode=0
Mar 08 03:37:07.791549 master-0 kubenswrapper[13046]: I0308 03:37:07.790765 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc"
Mar 08 03:37:07.791549 master-0 kubenswrapper[13046]: I0308 03:37:07.791103 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" event={"ID":"ada12bf8-e0a1-46dd-9a45-876157b33dd6","Type":"ContainerDied","Data":"780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60"}
Mar 08 03:37:07.791549 master-0 kubenswrapper[13046]: I0308 03:37:07.791156 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-qp5cc" event={"ID":"ada12bf8-e0a1-46dd-9a45-876157b33dd6","Type":"ContainerDied","Data":"11f236aec72b8ea3e67cc12f65ec1a75f24fef77706f381f990adc7cf964b6c4"}
Mar 08 03:37:07.791549 master-0 kubenswrapper[13046]: I0308 03:37:07.791174 13046 scope.go:117] "RemoveContainer" containerID="780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60"
Mar 08 03:37:07.795228 master-0 kubenswrapper[13046]: I0308 03:37:07.794970 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:07.795228 master-0 kubenswrapper[13046]: I0308 03:37:07.794995 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:07.795228 master-0 kubenswrapper[13046]: I0308 03:37:07.795003 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:07.795228 master-0 kubenswrapper[13046]: I0308 03:37:07.795013 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ada12bf8-e0a1-46dd-9a45-876157b33dd6-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:07.797304 master-0 kubenswrapper[13046]: I0308 03:37:07.796579 13046 generic.go:334] "Generic (PLEG): container finished" podID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerID="35359c1f2956bed687b83a202dacc2edd597fb6dc3bdc1af75d3e9fcc13607ff" exitCode=0
Mar 08 03:37:07.798872 master-0 kubenswrapper[13046]: I0308 03:37:07.798794 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" event={"ID":"e26fb299-b4b3-4f84-acb9-82afd62a9c39","Type":"ContainerDied","Data":"35359c1f2956bed687b83a202dacc2edd597fb6dc3bdc1af75d3e9fcc13607ff"}
Mar 08 03:37:07.798991 master-0 kubenswrapper[13046]: I0308 03:37:07.798872 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" event={"ID":"e26fb299-b4b3-4f84-acb9-82afd62a9c39","Type":"ContainerStarted","Data":"b2569c51cfa7ef60b635cd8aec00a49b1f725894db0715619719c044df012241"}
Mar 08 03:37:07.834029 master-0 kubenswrapper[13046]: I0308 03:37:07.833515 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.197427019 podStartE2EDuration="4.833493407s" podCreationTimestamp="2026-03-08 03:37:03 +0000 UTC" firstStartedPulling="2026-03-08 03:37:05.442336075 +0000 UTC m=+1427.521103292" lastFinishedPulling="2026-03-08 03:37:07.078402453 +0000 UTC m=+1429.157169680" observedRunningTime="2026-03-08 03:37:07.809109896 +0000 UTC m=+1429.887877113" watchObservedRunningTime="2026-03-08 03:37:07.833493407 +0000 UTC m=+1429.912260624"
Mar 08 03:37:07.836470 master-0 kubenswrapper[13046]: I0308 03:37:07.836290 13046 scope.go:117] "RemoveContainer" containerID="7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3"
Mar 08 03:37:07.838150 master-0 kubenswrapper[13046]: I0308 03:37:07.837684 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 08 03:37:07.873265 master-0 kubenswrapper[13046]: I0308 03:37:07.873225 13046 scope.go:117] "RemoveContainer" containerID="780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60"
Mar 08 03:37:07.873834 master-0 kubenswrapper[13046]: E0308 03:37:07.873807 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60\": container with ID starting with 780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60 not found: ID does not exist" containerID="780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60"
Mar 08 03:37:07.873964 master-0 kubenswrapper[13046]: I0308 03:37:07.873935 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60"} err="failed to get container status \"780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60\": rpc error: code = NotFound desc = could not find container \"780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60\": container with ID starting with 780e280c1a368de7db913673177bce1a6ec4fcbf835b6cbe734eab457f1f6b60 not found: ID does not exist"
Mar 08 03:37:07.874066 master-0 kubenswrapper[13046]: I0308 03:37:07.874050 13046 scope.go:117] "RemoveContainer" containerID="7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3"
Mar 08 03:37:07.874559 master-0 kubenswrapper[13046]: E0308 03:37:07.874532 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3\": container with ID starting with 7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3 not found: ID does not exist" containerID="7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3"
Mar 08 03:37:07.874699 master-0 kubenswrapper[13046]: I0308 03:37:07.874675 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3"} err="failed to get container status \"7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3\": rpc error: code = NotFound desc = could not find container \"7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3\": container with ID starting with 7fef0d4d369df7e78235177edf03d717a25196a08fd46b6880b2ce951e998fc3 not found: ID does not exist"
Mar 08 03:37:07.889593 master-0 kubenswrapper[13046]: I0308 03:37:07.889541 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-qp5cc"]
Mar 08 03:37:07.900872 master-0 kubenswrapper[13046]: I0308 03:37:07.900796 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-qp5cc"]
Mar 08 03:37:08.138740 master-0 kubenswrapper[13046]: I0308 03:37:08.138630 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" path="/var/lib/kubelet/pods/ada12bf8-e0a1-46dd-9a45-876157b33dd6/volumes"
Mar 08 03:37:08.451352 master-0 kubenswrapper[13046]: I0308 03:37:08.451240 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9dc808e4-3438-43cc-ae1d-5305ced8e3b9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0893f9fd-26b1-4ab8-ab8b-5d1166264044\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:08.537992 master-0 kubenswrapper[13046]: I0308 03:37:08.537928 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tbd5b"]
Mar 08 03:37:08.538724 master-0 kubenswrapper[13046]: E0308 03:37:08.538329 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerName="dnsmasq-dns"
Mar 08 03:37:08.538724 master-0 kubenswrapper[13046]: I0308 03:37:08.538346 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerName="dnsmasq-dns"
Mar 08 03:37:08.538724 master-0 kubenswrapper[13046]: E0308 03:37:08.538373 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerName="init"
Mar 08 03:37:08.538724 master-0 kubenswrapper[13046]: I0308 03:37:08.538380 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerName="init"
Mar 08 03:37:08.538724 master-0 kubenswrapper[13046]: I0308 03:37:08.538604 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada12bf8-e0a1-46dd-9a45-876157b33dd6" containerName="dnsmasq-dns"
Mar 08 03:37:08.539232 master-0 kubenswrapper[13046]: I0308 03:37:08.539189 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.541351 master-0 kubenswrapper[13046]: I0308 03:37:08.541308 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 08 03:37:08.541468 master-0 kubenswrapper[13046]: I0308 03:37:08.541381 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 08 03:37:08.541642 master-0 kubenswrapper[13046]: I0308 03:37:08.541608 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 08 03:37:08.570522 master-0 kubenswrapper[13046]: I0308 03:37:08.570443 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tbd5b"]
Mar 08 03:37:08.611648 master-0 kubenswrapper[13046]: I0308 03:37:08.611559 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb5h7\" (UniqueName: \"kubernetes.io/projected/f646560d-325d-41dc-ac99-a36f08ba0149-kube-api-access-mb5h7\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.611883 master-0 kubenswrapper[13046]: I0308 03:37:08.611736 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:08.611883 master-0 kubenswrapper[13046]: I0308 03:37:08.611840 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f646560d-325d-41dc-ac99-a36f08ba0149-etc-swift\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.611883 master-0 kubenswrapper[13046]: I0308 03:37:08.611869 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-scripts\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.612022 master-0 kubenswrapper[13046]: I0308 03:37:08.611927 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-dispersionconf\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.612022 master-0 kubenswrapper[13046]: I0308 03:37:08.611954 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-ring-data-devices\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.612022 master-0 kubenswrapper[13046]: I0308 03:37:08.611974 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-swiftconf\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.612022 master-0 kubenswrapper[13046]: I0308 03:37:08.611995 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-combined-ca-bundle\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.612192 master-0 kubenswrapper[13046]: E0308 03:37:08.612174 13046 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 03:37:08.612192 master-0 kubenswrapper[13046]: E0308 03:37:08.612191 13046 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 03:37:08.612279 master-0 kubenswrapper[13046]: E0308 03:37:08.612235 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift podName:bdcf318b-3d3e-42da-ae4b-39c6a17f8437 nodeName:}" failed. No retries permitted until 2026-03-08 03:37:10.612217021 +0000 UTC m=+1432.690984248 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift") pod "swift-storage-0" (UID: "bdcf318b-3d3e-42da-ae4b-39c6a17f8437") : configmap "swift-ring-files" not found
Mar 08 03:37:08.714072 master-0 kubenswrapper[13046]: I0308 03:37:08.713922 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f646560d-325d-41dc-ac99-a36f08ba0149-etc-swift\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.714072 master-0 kubenswrapper[13046]: I0308 03:37:08.713989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-scripts\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.714072 master-0 kubenswrapper[13046]: I0308 03:37:08.714059 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-dispersionconf\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.714836 master-0 kubenswrapper[13046]: I0308 03:37:08.714086 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-ring-data-devices\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.714836 master-0 kubenswrapper[13046]: I0308 03:37:08.714110 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-swiftconf\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.714836 master-0 kubenswrapper[13046]: I0308 03:37:08.714135 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-combined-ca-bundle\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.714836 master-0 kubenswrapper[13046]: I0308 03:37:08.714185 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb5h7\" (UniqueName: \"kubernetes.io/projected/f646560d-325d-41dc-ac99-a36f08ba0149-kube-api-access-mb5h7\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.714836 master-0 kubenswrapper[13046]: I0308 03:37:08.714417 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f646560d-325d-41dc-ac99-a36f08ba0149-etc-swift\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.715124 master-0 kubenswrapper[13046]: I0308 03:37:08.715032 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-ring-data-devices\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.715540 master-0 kubenswrapper[13046]: I0308 03:37:08.715510 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-scripts\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.718444 master-0 kubenswrapper[13046]: I0308 03:37:08.718376 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-dispersionconf\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.719095 master-0 kubenswrapper[13046]: I0308 03:37:08.719053 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-swiftconf\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.719238 master-0 kubenswrapper[13046]: I0308 03:37:08.719204 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-combined-ca-bundle\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.743238 master-0 kubenswrapper[13046]: I0308 03:37:08.743189 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb5h7\" (UniqueName: \"kubernetes.io/projected/f646560d-325d-41dc-ac99-a36f08ba0149-kube-api-access-mb5h7\") pod \"swift-ring-rebalance-tbd5b\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") " pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:08.809693 master-0 kubenswrapper[13046]: I0308 03:37:08.809628 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" event={"ID":"e26fb299-b4b3-4f84-acb9-82afd62a9c39","Type":"ContainerStarted","Data":"18ca1a76436c4966da4a1dd5931cb3c3ecfdab86f06b2055bde8316c3eaa330c"}
Mar 08 03:37:08.811704 master-0 kubenswrapper[13046]: I0308 03:37:08.811636 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz"
Mar 08 03:37:08.840817 master-0 kubenswrapper[13046]: I0308 03:37:08.840705 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" podStartSLOduration=3.840690648 podStartE2EDuration="3.840690648s" podCreationTimestamp="2026-03-08 03:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:08.836433247 +0000 UTC m=+1430.915200504" watchObservedRunningTime="2026-03-08 03:37:08.840690648 +0000 UTC m=+1430.919457865"
Mar 08 03:37:08.876178 master-0 kubenswrapper[13046]: I0308 03:37:08.876132 13046 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-ring-rebalance-tbd5b" Mar 08 03:37:09.448539 master-0 kubenswrapper[13046]: I0308 03:37:09.442599 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tbd5b"] Mar 08 03:37:09.454929 master-0 kubenswrapper[13046]: W0308 03:37:09.454848 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf646560d_325d_41dc_ac99_a36f08ba0149.slice/crio-f27ae11df48c1b0ded8c5f1acb4e55b7efa3b1b0e80fc5f84d692c1a9d0d9dc6 WatchSource:0}: Error finding container f27ae11df48c1b0ded8c5f1acb4e55b7efa3b1b0e80fc5f84d692c1a9d0d9dc6: Status 404 returned error can't find the container with id f27ae11df48c1b0ded8c5f1acb4e55b7efa3b1b0e80fc5f84d692c1a9d0d9dc6 Mar 08 03:37:09.823466 master-0 kubenswrapper[13046]: I0308 03:37:09.823283 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbd5b" event={"ID":"f646560d-325d-41dc-ac99-a36f08ba0149","Type":"ContainerStarted","Data":"f27ae11df48c1b0ded8c5f1acb4e55b7efa3b1b0e80fc5f84d692c1a9d0d9dc6"} Mar 08 03:37:10.462121 master-0 kubenswrapper[13046]: I0308 03:37:10.462056 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mtgtj"] Mar 08 03:37:10.480652 master-0 kubenswrapper[13046]: I0308 03:37:10.479349 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mtgtj"] Mar 08 03:37:10.480652 master-0 kubenswrapper[13046]: I0308 03:37:10.479477 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:10.486478 master-0 kubenswrapper[13046]: I0308 03:37:10.486446 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 03:37:10.566985 master-0 kubenswrapper[13046]: I0308 03:37:10.566904 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/c8a15b78-56e3-49f1-985a-683f0f3ffde9-kube-api-access-7rssn\") pod \"root-account-create-update-mtgtj\" (UID: \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:10.567313 master-0 kubenswrapper[13046]: I0308 03:37:10.567013 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a15b78-56e3-49f1-985a-683f0f3ffde9-operator-scripts\") pod \"root-account-create-update-mtgtj\" (UID: \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:10.669897 master-0 kubenswrapper[13046]: I0308 03:37:10.669006 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/c8a15b78-56e3-49f1-985a-683f0f3ffde9-kube-api-access-7rssn\") pod \"root-account-create-update-mtgtj\" (UID: \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:10.669897 master-0 kubenswrapper[13046]: I0308 03:37:10.669070 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0" Mar 08 03:37:10.669897 master-0 kubenswrapper[13046]: 
I0308 03:37:10.669100 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a15b78-56e3-49f1-985a-683f0f3ffde9-operator-scripts\") pod \"root-account-create-update-mtgtj\" (UID: \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:10.669897 master-0 kubenswrapper[13046]: E0308 03:37:10.669402 13046 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 03:37:10.669897 master-0 kubenswrapper[13046]: E0308 03:37:10.669434 13046 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 03:37:10.669897 master-0 kubenswrapper[13046]: E0308 03:37:10.669472 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift podName:bdcf318b-3d3e-42da-ae4b-39c6a17f8437 nodeName:}" failed. No retries permitted until 2026-03-08 03:37:14.669458118 +0000 UTC m=+1436.748225335 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift") pod "swift-storage-0" (UID: "bdcf318b-3d3e-42da-ae4b-39c6a17f8437") : configmap "swift-ring-files" not found Mar 08 03:37:10.673145 master-0 kubenswrapper[13046]: I0308 03:37:10.670886 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a15b78-56e3-49f1-985a-683f0f3ffde9-operator-scripts\") pod \"root-account-create-update-mtgtj\" (UID: \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:10.694624 master-0 kubenswrapper[13046]: I0308 03:37:10.692034 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/c8a15b78-56e3-49f1-985a-683f0f3ffde9-kube-api-access-7rssn\") pod \"root-account-create-update-mtgtj\" (UID: \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:10.840966 master-0 kubenswrapper[13046]: I0308 03:37:10.840022 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:11.569865 master-0 kubenswrapper[13046]: I0308 03:37:11.568848 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mtgtj"] Mar 08 03:37:11.793507 master-0 kubenswrapper[13046]: I0308 03:37:11.792779 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jj9pp"] Mar 08 03:37:11.797502 master-0 kubenswrapper[13046]: I0308 03:37:11.795280 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:11.809695 master-0 kubenswrapper[13046]: I0308 03:37:11.809565 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jj9pp"] Mar 08 03:37:11.901175 master-0 kubenswrapper[13046]: I0308 03:37:11.901111 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-95b6-account-create-update-jlsmg"] Mar 08 03:37:11.902572 master-0 kubenswrapper[13046]: I0308 03:37:11.902533 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:11.904379 master-0 kubenswrapper[13046]: I0308 03:37:11.904350 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 08 03:37:11.906206 master-0 kubenswrapper[13046]: I0308 03:37:11.906164 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv295\" (UniqueName: \"kubernetes.io/projected/da0388d7-fdd0-4f0b-9614-8122eb3258ec-kube-api-access-bv295\") pod \"glance-db-create-jj9pp\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:11.906310 master-0 kubenswrapper[13046]: I0308 03:37:11.906260 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0388d7-fdd0-4f0b-9614-8122eb3258ec-operator-scripts\") pod \"glance-db-create-jj9pp\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:11.929470 master-0 kubenswrapper[13046]: I0308 03:37:11.915421 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-95b6-account-create-update-jlsmg"] Mar 08 03:37:11.996491 master-0 kubenswrapper[13046]: I0308 03:37:11.996435 13046 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-db-create-xbkt2"] Mar 08 03:37:11.997821 master-0 kubenswrapper[13046]: I0308 03:37:11.997793 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.007864 master-0 kubenswrapper[13046]: I0308 03:37:12.007756 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xbkt2"] Mar 08 03:37:12.034600 master-0 kubenswrapper[13046]: I0308 03:37:12.033745 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv295\" (UniqueName: \"kubernetes.io/projected/da0388d7-fdd0-4f0b-9614-8122eb3258ec-kube-api-access-bv295\") pod \"glance-db-create-jj9pp\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:12.034600 master-0 kubenswrapper[13046]: I0308 03:37:12.034578 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-operator-scripts\") pod \"glance-95b6-account-create-update-jlsmg\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") " pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:12.034839 master-0 kubenswrapper[13046]: I0308 03:37:12.034723 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjdk\" (UniqueName: \"kubernetes.io/projected/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-kube-api-access-xcjdk\") pod \"glance-95b6-account-create-update-jlsmg\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") " pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:12.034886 master-0 kubenswrapper[13046]: I0308 03:37:12.034836 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0388d7-fdd0-4f0b-9614-8122eb3258ec-operator-scripts\") 
pod \"glance-db-create-jj9pp\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:12.036761 master-0 kubenswrapper[13046]: I0308 03:37:12.036722 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0388d7-fdd0-4f0b-9614-8122eb3258ec-operator-scripts\") pod \"glance-db-create-jj9pp\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:12.084309 master-0 kubenswrapper[13046]: I0308 03:37:12.084195 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv295\" (UniqueName: \"kubernetes.io/projected/da0388d7-fdd0-4f0b-9614-8122eb3258ec-kube-api-access-bv295\") pod \"glance-db-create-jj9pp\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:12.091285 master-0 kubenswrapper[13046]: I0308 03:37:12.091233 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c412-account-create-update-zvl4c"] Mar 08 03:37:12.094496 master-0 kubenswrapper[13046]: I0308 03:37:12.092471 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:12.094496 master-0 kubenswrapper[13046]: I0308 03:37:12.094295 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 03:37:12.117810 master-0 kubenswrapper[13046]: I0308 03:37:12.117550 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c412-account-create-update-zvl4c"] Mar 08 03:37:12.137787 master-0 kubenswrapper[13046]: I0308 03:37:12.137746 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:12.140609 master-0 kubenswrapper[13046]: I0308 03:37:12.139416 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-operator-scripts\") pod \"keystone-db-create-xbkt2\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") " pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.140609 master-0 kubenswrapper[13046]: I0308 03:37:12.139686 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8l8\" (UniqueName: \"kubernetes.io/projected/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-kube-api-access-6k8l8\") pod \"keystone-db-create-xbkt2\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") " pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.140609 master-0 kubenswrapper[13046]: I0308 03:37:12.140293 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-operator-scripts\") pod \"glance-95b6-account-create-update-jlsmg\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") " pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:12.140609 master-0 kubenswrapper[13046]: I0308 03:37:12.140336 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjdk\" (UniqueName: \"kubernetes.io/projected/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-kube-api-access-xcjdk\") pod \"glance-95b6-account-create-update-jlsmg\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") " pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:12.141472 master-0 kubenswrapper[13046]: I0308 03:37:12.141430 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-operator-scripts\") pod \"glance-95b6-account-create-update-jlsmg\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") " pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:12.160190 master-0 kubenswrapper[13046]: I0308 03:37:12.160137 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjdk\" (UniqueName: \"kubernetes.io/projected/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-kube-api-access-xcjdk\") pod \"glance-95b6-account-create-update-jlsmg\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") " pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:12.241963 master-0 kubenswrapper[13046]: I0308 03:37:12.241897 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95995527-8076-43fb-8a0e-a9678030ad5e-operator-scripts\") pod \"keystone-c412-account-create-update-zvl4c\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") " pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:12.242192 master-0 kubenswrapper[13046]: I0308 03:37:12.242019 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-operator-scripts\") pod \"keystone-db-create-xbkt2\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") " pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.242192 master-0 kubenswrapper[13046]: I0308 03:37:12.242097 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkbc\" (UniqueName: \"kubernetes.io/projected/95995527-8076-43fb-8a0e-a9678030ad5e-kube-api-access-vzkbc\") pod \"keystone-c412-account-create-update-zvl4c\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") " pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 
03:37:12.242192 master-0 kubenswrapper[13046]: I0308 03:37:12.242120 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8l8\" (UniqueName: \"kubernetes.io/projected/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-kube-api-access-6k8l8\") pod \"keystone-db-create-xbkt2\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") " pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.244151 master-0 kubenswrapper[13046]: I0308 03:37:12.244089 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-operator-scripts\") pod \"keystone-db-create-xbkt2\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") " pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.252415 master-0 kubenswrapper[13046]: I0308 03:37:12.252369 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kpzhp"] Mar 08 03:37:12.253580 master-0 kubenswrapper[13046]: I0308 03:37:12.253547 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.276002 master-0 kubenswrapper[13046]: I0308 03:37:12.264866 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8l8\" (UniqueName: \"kubernetes.io/projected/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-kube-api-access-6k8l8\") pod \"keystone-db-create-xbkt2\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") " pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.276002 master-0 kubenswrapper[13046]: I0308 03:37:12.273121 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:12.283658 master-0 kubenswrapper[13046]: I0308 03:37:12.283600 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kpzhp"] Mar 08 03:37:12.345061 master-0 kubenswrapper[13046]: I0308 03:37:12.344988 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkbc\" (UniqueName: \"kubernetes.io/projected/95995527-8076-43fb-8a0e-a9678030ad5e-kube-api-access-vzkbc\") pod \"keystone-c412-account-create-update-zvl4c\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") " pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:12.345322 master-0 kubenswrapper[13046]: I0308 03:37:12.345178 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95995527-8076-43fb-8a0e-a9678030ad5e-operator-scripts\") pod \"keystone-c412-account-create-update-zvl4c\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") " pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:12.346191 master-0 kubenswrapper[13046]: I0308 03:37:12.346152 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95995527-8076-43fb-8a0e-a9678030ad5e-operator-scripts\") pod \"keystone-c412-account-create-update-zvl4c\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") " pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:12.348553 master-0 kubenswrapper[13046]: I0308 03:37:12.348460 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:12.370084 master-0 kubenswrapper[13046]: I0308 03:37:12.370013 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkbc\" (UniqueName: \"kubernetes.io/projected/95995527-8076-43fb-8a0e-a9678030ad5e-kube-api-access-vzkbc\") pod \"keystone-c412-account-create-update-zvl4c\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") " pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:12.418448 master-0 kubenswrapper[13046]: I0308 03:37:12.394414 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-eb58-account-create-update-l7wpw"] Mar 08 03:37:12.418448 master-0 kubenswrapper[13046]: I0308 03:37:12.396168 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.418448 master-0 kubenswrapper[13046]: I0308 03:37:12.398829 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 08 03:37:12.418448 master-0 kubenswrapper[13046]: I0308 03:37:12.406964 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb58-account-create-update-l7wpw"] Mar 08 03:37:12.443862 master-0 kubenswrapper[13046]: I0308 03:37:12.443768 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:12.447725 master-0 kubenswrapper[13046]: I0308 03:37:12.447640 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkml\" (UniqueName: \"kubernetes.io/projected/2cd2c544-a565-4f70-978b-667ba7c35a57-kube-api-access-bqkml\") pod \"placement-db-create-kpzhp\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.447825 master-0 kubenswrapper[13046]: I0308 03:37:12.447738 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cd2c544-a565-4f70-978b-667ba7c35a57-operator-scripts\") pod \"placement-db-create-kpzhp\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.550864 master-0 kubenswrapper[13046]: I0308 03:37:12.550777 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkml\" (UniqueName: \"kubernetes.io/projected/2cd2c544-a565-4f70-978b-667ba7c35a57-kube-api-access-bqkml\") pod \"placement-db-create-kpzhp\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.551103 master-0 kubenswrapper[13046]: I0308 03:37:12.550906 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-kube-api-access-ztmb9\") pod \"placement-eb58-account-create-update-l7wpw\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") " pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.551103 master-0 kubenswrapper[13046]: I0308 03:37:12.551059 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-operator-scripts\") pod \"placement-eb58-account-create-update-l7wpw\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") " pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.551201 master-0 kubenswrapper[13046]: I0308 03:37:12.551110 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cd2c544-a565-4f70-978b-667ba7c35a57-operator-scripts\") pod \"placement-db-create-kpzhp\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.552403 master-0 kubenswrapper[13046]: I0308 03:37:12.552371 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cd2c544-a565-4f70-978b-667ba7c35a57-operator-scripts\") pod \"placement-db-create-kpzhp\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.569034 master-0 kubenswrapper[13046]: I0308 03:37:12.568913 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkml\" (UniqueName: \"kubernetes.io/projected/2cd2c544-a565-4f70-978b-667ba7c35a57-kube-api-access-bqkml\") pod \"placement-db-create-kpzhp\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.571703 master-0 kubenswrapper[13046]: W0308 03:37:12.571635 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8a15b78_56e3_49f1_985a_683f0f3ffde9.slice/crio-9bec4b309cd39b3df8b9924a0c77d3f1c4eaa2fc05899a1026f8b469f38c64ac WatchSource:0}: Error finding container 9bec4b309cd39b3df8b9924a0c77d3f1c4eaa2fc05899a1026f8b469f38c64ac: Status 404 returned error can't find the container with id 
9bec4b309cd39b3df8b9924a0c77d3f1c4eaa2fc05899a1026f8b469f38c64ac Mar 08 03:37:12.619468 master-0 kubenswrapper[13046]: I0308 03:37:12.619275 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:12.653327 master-0 kubenswrapper[13046]: I0308 03:37:12.653228 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-operator-scripts\") pod \"placement-eb58-account-create-update-l7wpw\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") " pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.653645 master-0 kubenswrapper[13046]: I0308 03:37:12.653425 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-kube-api-access-ztmb9\") pod \"placement-eb58-account-create-update-l7wpw\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") " pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.654303 master-0 kubenswrapper[13046]: I0308 03:37:12.654249 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-operator-scripts\") pod \"placement-eb58-account-create-update-l7wpw\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") " pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.671539 master-0 kubenswrapper[13046]: I0308 03:37:12.671450 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-kube-api-access-ztmb9\") pod \"placement-eb58-account-create-update-l7wpw\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") " 
pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.718659 master-0 kubenswrapper[13046]: I0308 03:37:12.718567 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:12.905643 master-0 kubenswrapper[13046]: I0308 03:37:12.903441 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mtgtj" event={"ID":"c8a15b78-56e3-49f1-985a-683f0f3ffde9","Type":"ContainerStarted","Data":"9bec4b309cd39b3df8b9924a0c77d3f1c4eaa2fc05899a1026f8b469f38c64ac"} Mar 08 03:37:14.141752 master-0 kubenswrapper[13046]: W0308 03:37:14.141707 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda0388d7_fdd0_4f0b_9614_8122eb3258ec.slice/crio-6a533a6eeb5e61add3e01252aff3c9f557f7dc178ed75099748e5cd14ac602ef WatchSource:0}: Error finding container 6a533a6eeb5e61add3e01252aff3c9f557f7dc178ed75099748e5cd14ac602ef: Status 404 returned error can't find the container with id 6a533a6eeb5e61add3e01252aff3c9f557f7dc178ed75099748e5cd14ac602ef Mar 08 03:37:14.144601 master-0 kubenswrapper[13046]: I0308 03:37:14.144132 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jj9pp"] Mar 08 03:37:14.596353 master-0 kubenswrapper[13046]: I0308 03:37:14.596315 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-95b6-account-create-update-jlsmg"] Mar 08 03:37:14.606972 master-0 kubenswrapper[13046]: I0308 03:37:14.606941 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xbkt2"] Mar 08 03:37:14.615599 master-0 kubenswrapper[13046]: W0308 03:37:14.614662 13046 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d3025d7_d6e2_42c7_8352_ec8199a2e9ee.slice/crio-99d76020f1ed20c91874db97f3a65f0ff7ea63486db1d2593586bd5d3aebbaf7 WatchSource:0}: Error finding container 99d76020f1ed20c91874db97f3a65f0ff7ea63486db1d2593586bd5d3aebbaf7: Status 404 returned error can't find the container with id 99d76020f1ed20c91874db97f3a65f0ff7ea63486db1d2593586bd5d3aebbaf7 Mar 08 03:37:14.619621 master-0 kubenswrapper[13046]: I0308 03:37:14.619577 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb58-account-create-update-l7wpw"] Mar 08 03:37:14.632519 master-0 kubenswrapper[13046]: W0308 03:37:14.631678 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd02bd09c_8e6b_40b9_967d_d93b3621ae5b.slice/crio-5811246e6adf30759431b2b61e17d2a86463585a5ad4592c96ebb9f231c4ee9d WatchSource:0}: Error finding container 5811246e6adf30759431b2b61e17d2a86463585a5ad4592c96ebb9f231c4ee9d: Status 404 returned error can't find the container with id 5811246e6adf30759431b2b61e17d2a86463585a5ad4592c96ebb9f231c4ee9d Mar 08 03:37:14.632778 master-0 kubenswrapper[13046]: I0308 03:37:14.632582 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c412-account-create-update-zvl4c"] Mar 08 03:37:14.705785 master-0 kubenswrapper[13046]: I0308 03:37:14.705048 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0" Mar 08 03:37:14.705785 master-0 kubenswrapper[13046]: E0308 03:37:14.705267 13046 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 03:37:14.705785 master-0 kubenswrapper[13046]: E0308 03:37:14.705301 13046 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 03:37:14.705785 master-0 kubenswrapper[13046]: E0308 03:37:14.705370 13046 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift podName:bdcf318b-3d3e-42da-ae4b-39c6a17f8437 nodeName:}" failed. No retries permitted until 2026-03-08 03:37:22.705350132 +0000 UTC m=+1444.784117369 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift") pod "swift-storage-0" (UID: "bdcf318b-3d3e-42da-ae4b-39c6a17f8437") : configmap "swift-ring-files" not found Mar 08 03:37:14.726089 master-0 kubenswrapper[13046]: I0308 03:37:14.723726 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kpzhp"] Mar 08 03:37:14.960744 master-0 kubenswrapper[13046]: I0308 03:37:14.960157 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb58-account-create-update-l7wpw" event={"ID":"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee","Type":"ContainerStarted","Data":"b3e824a99970d6cdb570b7aa177700f6929d08f94a4e2b673868af8bfb28b0fb"} Mar 08 03:37:14.960744 master-0 kubenswrapper[13046]: I0308 03:37:14.960728 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb58-account-create-update-l7wpw" event={"ID":"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee","Type":"ContainerStarted","Data":"99d76020f1ed20c91874db97f3a65f0ff7ea63486db1d2593586bd5d3aebbaf7"} Mar 08 03:37:14.973046 master-0 kubenswrapper[13046]: I0308 03:37:14.972990 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kpzhp" event={"ID":"2cd2c544-a565-4f70-978b-667ba7c35a57","Type":"ContainerStarted","Data":"d09b336517509173760f52369cf1e7d4628ddf5119ed182df491250aebe0a262"} Mar 08 03:37:14.987136 
master-0 kubenswrapper[13046]: I0308 03:37:14.986182 13046 generic.go:334] "Generic (PLEG): container finished" podID="c8a15b78-56e3-49f1-985a-683f0f3ffde9" containerID="48cd038ded94fcd5f505dbc354f100a71f9b275b1fe4afa1e68a9b25d76dfa36" exitCode=0 Mar 08 03:37:14.987136 master-0 kubenswrapper[13046]: I0308 03:37:14.986272 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mtgtj" event={"ID":"c8a15b78-56e3-49f1-985a-683f0f3ffde9","Type":"ContainerDied","Data":"48cd038ded94fcd5f505dbc354f100a71f9b275b1fe4afa1e68a9b25d76dfa36"} Mar 08 03:37:14.991358 master-0 kubenswrapper[13046]: I0308 03:37:14.991288 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-eb58-account-create-update-l7wpw" podStartSLOduration=2.991263987 podStartE2EDuration="2.991263987s" podCreationTimestamp="2026-03-08 03:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:14.984763843 +0000 UTC m=+1437.063531060" watchObservedRunningTime="2026-03-08 03:37:14.991263987 +0000 UTC m=+1437.070031214" Mar 08 03:37:14.995311 master-0 kubenswrapper[13046]: I0308 03:37:14.995260 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xbkt2" event={"ID":"d02bd09c-8e6b-40b9-967d-d93b3621ae5b","Type":"ContainerStarted","Data":"e9b00a230484d6acafff4a84213a2da26448c6cc415e76f80d1e3c74e1038f75"} Mar 08 03:37:14.995462 master-0 kubenswrapper[13046]: I0308 03:37:14.995329 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xbkt2" event={"ID":"d02bd09c-8e6b-40b9-967d-d93b3621ae5b","Type":"ContainerStarted","Data":"5811246e6adf30759431b2b61e17d2a86463585a5ad4592c96ebb9f231c4ee9d"} Mar 08 03:37:15.000250 master-0 kubenswrapper[13046]: I0308 03:37:15.000195 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-95b6-account-create-update-jlsmg" event={"ID":"17dd1ec6-c2b0-46a9-b162-6efee4a883b9","Type":"ContainerStarted","Data":"6f81c42c2ed92c30cdaf63e460e31b7e1bfa1a76b96f7e6f849b345fdb1766cc"} Mar 08 03:37:15.000250 master-0 kubenswrapper[13046]: I0308 03:37:15.000248 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95b6-account-create-update-jlsmg" event={"ID":"17dd1ec6-c2b0-46a9-b162-6efee4a883b9","Type":"ContainerStarted","Data":"887100102f886110a5355bcc6ff38ef311312fde42c769513cc2b24e2720259f"} Mar 08 03:37:15.003505 master-0 kubenswrapper[13046]: I0308 03:37:15.003455 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbd5b" event={"ID":"f646560d-325d-41dc-ac99-a36f08ba0149","Type":"ContainerStarted","Data":"d9940db9a3f723e912a488c34cf3ca80600986c5b2cbcc09c58cacc4e7000330"} Mar 08 03:37:15.005323 master-0 kubenswrapper[13046]: I0308 03:37:15.005300 13046 generic.go:334] "Generic (PLEG): container finished" podID="da0388d7-fdd0-4f0b-9614-8122eb3258ec" containerID="461e31be4361e50fdbfc062802513f65ec890b82ef96ddb4f2e1d4b1a514eed0" exitCode=0 Mar 08 03:37:15.005442 master-0 kubenswrapper[13046]: I0308 03:37:15.005347 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jj9pp" event={"ID":"da0388d7-fdd0-4f0b-9614-8122eb3258ec","Type":"ContainerDied","Data":"461e31be4361e50fdbfc062802513f65ec890b82ef96ddb4f2e1d4b1a514eed0"} Mar 08 03:37:15.005549 master-0 kubenswrapper[13046]: I0308 03:37:15.005534 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jj9pp" event={"ID":"da0388d7-fdd0-4f0b-9614-8122eb3258ec","Type":"ContainerStarted","Data":"6a533a6eeb5e61add3e01252aff3c9f557f7dc178ed75099748e5cd14ac602ef"} Mar 08 03:37:15.009944 master-0 kubenswrapper[13046]: I0308 03:37:15.009893 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c412-account-create-update-zvl4c" 
event={"ID":"95995527-8076-43fb-8a0e-a9678030ad5e","Type":"ContainerStarted","Data":"824af6069fd162c64f76b33e7938589cd758c8885c13b07192e47f94f1b439d2"} Mar 08 03:37:15.010123 master-0 kubenswrapper[13046]: I0308 03:37:15.009983 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c412-account-create-update-zvl4c" event={"ID":"95995527-8076-43fb-8a0e-a9678030ad5e","Type":"ContainerStarted","Data":"7e57ab3666c05bf0a4b5d133dcaa9aab486dcf420ced4a45bb3444259e178d88"} Mar 08 03:37:15.038549 master-0 kubenswrapper[13046]: I0308 03:37:15.031310 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c412-account-create-update-zvl4c" podStartSLOduration=3.031292412 podStartE2EDuration="3.031292412s" podCreationTimestamp="2026-03-08 03:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:15.027842134 +0000 UTC m=+1437.106609351" watchObservedRunningTime="2026-03-08 03:37:15.031292412 +0000 UTC m=+1437.110059629" Mar 08 03:37:15.083321 master-0 kubenswrapper[13046]: I0308 03:37:15.083239 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tbd5b" podStartSLOduration=2.83041687 podStartE2EDuration="7.083203393s" podCreationTimestamp="2026-03-08 03:37:08 +0000 UTC" firstStartedPulling="2026-03-08 03:37:09.461902757 +0000 UTC m=+1431.540669974" lastFinishedPulling="2026-03-08 03:37:13.71468928 +0000 UTC m=+1435.793456497" observedRunningTime="2026-03-08 03:37:15.075248148 +0000 UTC m=+1437.154015365" watchObservedRunningTime="2026-03-08 03:37:15.083203393 +0000 UTC m=+1437.161970610" Mar 08 03:37:15.114286 master-0 kubenswrapper[13046]: I0308 03:37:15.114201 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-95b6-account-create-update-jlsmg" podStartSLOduration=4.114176531 podStartE2EDuration="4.114176531s" 
podCreationTimestamp="2026-03-08 03:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:15.094688959 +0000 UTC m=+1437.173456176" watchObservedRunningTime="2026-03-08 03:37:15.114176531 +0000 UTC m=+1437.192943768" Mar 08 03:37:15.132877 master-0 kubenswrapper[13046]: I0308 03:37:15.131867 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-xbkt2" podStartSLOduration=4.131843182 podStartE2EDuration="4.131843182s" podCreationTimestamp="2026-03-08 03:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:15.109686064 +0000 UTC m=+1437.188453281" watchObservedRunningTime="2026-03-08 03:37:15.131843182 +0000 UTC m=+1437.210610409" Mar 08 03:37:15.908815 master-0 kubenswrapper[13046]: I0308 03:37:15.908752 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:16.002356 master-0 kubenswrapper[13046]: I0308 03:37:16.002310 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-k9nrf"] Mar 08 03:37:16.002580 master-0 kubenswrapper[13046]: I0308 03:37:16.002530 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" containerName="dnsmasq-dns" containerID="cri-o://3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb" gracePeriod=10 Mar 08 03:37:16.085293 master-0 kubenswrapper[13046]: I0308 03:37:16.085244 13046 generic.go:334] "Generic (PLEG): container finished" podID="d02bd09c-8e6b-40b9-967d-d93b3621ae5b" containerID="e9b00a230484d6acafff4a84213a2da26448c6cc415e76f80d1e3c74e1038f75" exitCode=0 Mar 08 03:37:16.085577 master-0 kubenswrapper[13046]: I0308 03:37:16.085291 13046 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xbkt2" event={"ID":"d02bd09c-8e6b-40b9-967d-d93b3621ae5b","Type":"ContainerDied","Data":"e9b00a230484d6acafff4a84213a2da26448c6cc415e76f80d1e3c74e1038f75"} Mar 08 03:37:16.087343 master-0 kubenswrapper[13046]: I0308 03:37:16.087291 13046 generic.go:334] "Generic (PLEG): container finished" podID="17dd1ec6-c2b0-46a9-b162-6efee4a883b9" containerID="6f81c42c2ed92c30cdaf63e460e31b7e1bfa1a76b96f7e6f849b345fdb1766cc" exitCode=0 Mar 08 03:37:16.087415 master-0 kubenswrapper[13046]: I0308 03:37:16.087380 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95b6-account-create-update-jlsmg" event={"ID":"17dd1ec6-c2b0-46a9-b162-6efee4a883b9","Type":"ContainerDied","Data":"6f81c42c2ed92c30cdaf63e460e31b7e1bfa1a76b96f7e6f849b345fdb1766cc"} Mar 08 03:37:16.091995 master-0 kubenswrapper[13046]: I0308 03:37:16.091938 13046 generic.go:334] "Generic (PLEG): container finished" podID="95995527-8076-43fb-8a0e-a9678030ad5e" containerID="824af6069fd162c64f76b33e7938589cd758c8885c13b07192e47f94f1b439d2" exitCode=0 Mar 08 03:37:16.092094 master-0 kubenswrapper[13046]: I0308 03:37:16.092059 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c412-account-create-update-zvl4c" event={"ID":"95995527-8076-43fb-8a0e-a9678030ad5e","Type":"ContainerDied","Data":"824af6069fd162c64f76b33e7938589cd758c8885c13b07192e47f94f1b439d2"} Mar 08 03:37:16.093957 master-0 kubenswrapper[13046]: I0308 03:37:16.093923 13046 generic.go:334] "Generic (PLEG): container finished" podID="3d3025d7-d6e2-42c7-8352-ec8199a2e9ee" containerID="b3e824a99970d6cdb570b7aa177700f6929d08f94a4e2b673868af8bfb28b0fb" exitCode=0 Mar 08 03:37:16.094026 master-0 kubenswrapper[13046]: I0308 03:37:16.093968 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb58-account-create-update-l7wpw" 
event={"ID":"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee","Type":"ContainerDied","Data":"b3e824a99970d6cdb570b7aa177700f6929d08f94a4e2b673868af8bfb28b0fb"} Mar 08 03:37:16.096034 master-0 kubenswrapper[13046]: I0308 03:37:16.095697 13046 generic.go:334] "Generic (PLEG): container finished" podID="2cd2c544-a565-4f70-978b-667ba7c35a57" containerID="5654f00d64188293757e1c9ef3cdccdf33780f01db9ecdd3d48ebddd0c502803" exitCode=0 Mar 08 03:37:16.096034 master-0 kubenswrapper[13046]: I0308 03:37:16.095880 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kpzhp" event={"ID":"2cd2c544-a565-4f70-978b-667ba7c35a57","Type":"ContainerDied","Data":"5654f00d64188293757e1c9ef3cdccdf33780f01db9ecdd3d48ebddd0c502803"} Mar 08 03:37:16.762147 master-0 kubenswrapper[13046]: I0308 03:37:16.761968 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:16.815108 master-0 kubenswrapper[13046]: I0308 03:37:16.815034 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv295\" (UniqueName: \"kubernetes.io/projected/da0388d7-fdd0-4f0b-9614-8122eb3258ec-kube-api-access-bv295\") pod \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " Mar 08 03:37:16.815386 master-0 kubenswrapper[13046]: I0308 03:37:16.815155 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0388d7-fdd0-4f0b-9614-8122eb3258ec-operator-scripts\") pod \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\" (UID: \"da0388d7-fdd0-4f0b-9614-8122eb3258ec\") " Mar 08 03:37:16.816782 master-0 kubenswrapper[13046]: I0308 03:37:16.816750 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0388d7-fdd0-4f0b-9614-8122eb3258ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"da0388d7-fdd0-4f0b-9614-8122eb3258ec" (UID: "da0388d7-fdd0-4f0b-9614-8122eb3258ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:16.820573 master-0 kubenswrapper[13046]: I0308 03:37:16.819657 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0388d7-fdd0-4f0b-9614-8122eb3258ec-kube-api-access-bv295" (OuterVolumeSpecName: "kube-api-access-bv295") pod "da0388d7-fdd0-4f0b-9614-8122eb3258ec" (UID: "da0388d7-fdd0-4f0b-9614-8122eb3258ec"). InnerVolumeSpecName "kube-api-access-bv295". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:16.915433 master-0 kubenswrapper[13046]: I0308 03:37:16.901168 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:37:16.921757 master-0 kubenswrapper[13046]: I0308 03:37:16.921713 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:16.922649 master-0 kubenswrapper[13046]: I0308 03:37:16.922607 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv295\" (UniqueName: \"kubernetes.io/projected/da0388d7-fdd0-4f0b-9614-8122eb3258ec-kube-api-access-bv295\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:16.922787 master-0 kubenswrapper[13046]: I0308 03:37:16.922651 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0388d7-fdd0-4f0b-9614-8122eb3258ec-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:17.023559 master-0 kubenswrapper[13046]: I0308 03:37:17.023473 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a15b78-56e3-49f1-985a-683f0f3ffde9-operator-scripts\") pod \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\" (UID: 
\"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " Mar 08 03:37:17.023933 master-0 kubenswrapper[13046]: I0308 03:37:17.023623 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/c8a15b78-56e3-49f1-985a-683f0f3ffde9-kube-api-access-7rssn\") pod \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\" (UID: \"c8a15b78-56e3-49f1-985a-683f0f3ffde9\") " Mar 08 03:37:17.023933 master-0 kubenswrapper[13046]: I0308 03:37:17.023691 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8h7r\" (UniqueName: \"kubernetes.io/projected/38c2671c-0337-4af7-8a29-eef713b62f67-kube-api-access-n8h7r\") pod \"38c2671c-0337-4af7-8a29-eef713b62f67\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " Mar 08 03:37:17.023933 master-0 kubenswrapper[13046]: I0308 03:37:17.023811 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-config\") pod \"38c2671c-0337-4af7-8a29-eef713b62f67\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " Mar 08 03:37:17.023933 master-0 kubenswrapper[13046]: I0308 03:37:17.023875 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-dns-svc\") pod \"38c2671c-0337-4af7-8a29-eef713b62f67\" (UID: \"38c2671c-0337-4af7-8a29-eef713b62f67\") " Mar 08 03:37:17.024331 master-0 kubenswrapper[13046]: I0308 03:37:17.024294 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a15b78-56e3-49f1-985a-683f0f3ffde9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8a15b78-56e3-49f1-985a-683f0f3ffde9" (UID: "c8a15b78-56e3-49f1-985a-683f0f3ffde9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:17.030310 master-0 kubenswrapper[13046]: I0308 03:37:17.030262 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c2671c-0337-4af7-8a29-eef713b62f67-kube-api-access-n8h7r" (OuterVolumeSpecName: "kube-api-access-n8h7r") pod "38c2671c-0337-4af7-8a29-eef713b62f67" (UID: "38c2671c-0337-4af7-8a29-eef713b62f67"). InnerVolumeSpecName "kube-api-access-n8h7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:17.030369 master-0 kubenswrapper[13046]: I0308 03:37:17.030327 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a15b78-56e3-49f1-985a-683f0f3ffde9-kube-api-access-7rssn" (OuterVolumeSpecName: "kube-api-access-7rssn") pod "c8a15b78-56e3-49f1-985a-683f0f3ffde9" (UID: "c8a15b78-56e3-49f1-985a-683f0f3ffde9"). InnerVolumeSpecName "kube-api-access-7rssn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:17.070160 master-0 kubenswrapper[13046]: I0308 03:37:17.070111 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-config" (OuterVolumeSpecName: "config") pod "38c2671c-0337-4af7-8a29-eef713b62f67" (UID: "38c2671c-0337-4af7-8a29-eef713b62f67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:17.080222 master-0 kubenswrapper[13046]: I0308 03:37:17.079876 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38c2671c-0337-4af7-8a29-eef713b62f67" (UID: "38c2671c-0337-4af7-8a29-eef713b62f67"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:17.108631 master-0 kubenswrapper[13046]: I0308 03:37:17.108548 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mtgtj" event={"ID":"c8a15b78-56e3-49f1-985a-683f0f3ffde9","Type":"ContainerDied","Data":"9bec4b309cd39b3df8b9924a0c77d3f1c4eaa2fc05899a1026f8b469f38c64ac"} Mar 08 03:37:17.108631 master-0 kubenswrapper[13046]: I0308 03:37:17.108595 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bec4b309cd39b3df8b9924a0c77d3f1c4eaa2fc05899a1026f8b469f38c64ac" Mar 08 03:37:17.108631 master-0 kubenswrapper[13046]: I0308 03:37:17.108593 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mtgtj" Mar 08 03:37:17.109721 master-0 kubenswrapper[13046]: I0308 03:37:17.109665 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jj9pp" event={"ID":"da0388d7-fdd0-4f0b-9614-8122eb3258ec","Type":"ContainerDied","Data":"6a533a6eeb5e61add3e01252aff3c9f557f7dc178ed75099748e5cd14ac602ef"} Mar 08 03:37:17.109721 master-0 kubenswrapper[13046]: I0308 03:37:17.109687 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a533a6eeb5e61add3e01252aff3c9f557f7dc178ed75099748e5cd14ac602ef" Mar 08 03:37:17.109721 master-0 kubenswrapper[13046]: I0308 03:37:17.109705 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jj9pp" Mar 08 03:37:17.112066 master-0 kubenswrapper[13046]: I0308 03:37:17.111988 13046 generic.go:334] "Generic (PLEG): container finished" podID="38c2671c-0337-4af7-8a29-eef713b62f67" containerID="3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb" exitCode=0 Mar 08 03:37:17.112384 master-0 kubenswrapper[13046]: I0308 03:37:17.112311 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" Mar 08 03:37:17.122062 master-0 kubenswrapper[13046]: I0308 03:37:17.117872 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" event={"ID":"38c2671c-0337-4af7-8a29-eef713b62f67","Type":"ContainerDied","Data":"3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb"} Mar 08 03:37:17.122062 master-0 kubenswrapper[13046]: I0308 03:37:17.117926 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" event={"ID":"38c2671c-0337-4af7-8a29-eef713b62f67","Type":"ContainerDied","Data":"3ff9e9fcfaa6cf7f7f6de49617e7efb4db57c141e9524bb4c2da72737b9efa08"} Mar 08 03:37:17.122062 master-0 kubenswrapper[13046]: I0308 03:37:17.117947 13046 scope.go:117] "RemoveContainer" containerID="3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb" Mar 08 03:37:17.126865 master-0 kubenswrapper[13046]: I0308 03:37:17.126823 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8h7r\" (UniqueName: \"kubernetes.io/projected/38c2671c-0337-4af7-8a29-eef713b62f67-kube-api-access-n8h7r\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:17.126949 master-0 kubenswrapper[13046]: I0308 03:37:17.126872 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:17.126949 master-0 kubenswrapper[13046]: I0308 03:37:17.126887 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38c2671c-0337-4af7-8a29-eef713b62f67-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:17.126949 master-0 kubenswrapper[13046]: I0308 03:37:17.126900 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c8a15b78-56e3-49f1-985a-683f0f3ffde9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:17.126949 master-0 kubenswrapper[13046]: I0308 03:37:17.126912 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rssn\" (UniqueName: \"kubernetes.io/projected/c8a15b78-56e3-49f1-985a-683f0f3ffde9-kube-api-access-7rssn\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:17.164527 master-0 kubenswrapper[13046]: I0308 03:37:17.161524 13046 scope.go:117] "RemoveContainer" containerID="d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11" Mar 08 03:37:17.175343 master-0 kubenswrapper[13046]: I0308 03:37:17.175293 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-k9nrf"] Mar 08 03:37:17.187212 master-0 kubenswrapper[13046]: I0308 03:37:17.184886 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-k9nrf"] Mar 08 03:37:17.214762 master-0 kubenswrapper[13046]: I0308 03:37:17.205930 13046 scope.go:117] "RemoveContainer" containerID="3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb" Mar 08 03:37:17.214762 master-0 kubenswrapper[13046]: E0308 03:37:17.206347 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb\": container with ID starting with 3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb not found: ID does not exist" containerID="3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb" Mar 08 03:37:17.214762 master-0 kubenswrapper[13046]: I0308 03:37:17.206374 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb"} err="failed to get container status \"3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb\": rpc 
error: code = NotFound desc = could not find container \"3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb\": container with ID starting with 3f90880a89b2091803bad246898fd5dfa872d5e0db9091d8aac156ff5bf57dfb not found: ID does not exist" Mar 08 03:37:17.214762 master-0 kubenswrapper[13046]: I0308 03:37:17.206397 13046 scope.go:117] "RemoveContainer" containerID="d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11" Mar 08 03:37:17.214762 master-0 kubenswrapper[13046]: E0308 03:37:17.206717 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11\": container with ID starting with d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11 not found: ID does not exist" containerID="d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11" Mar 08 03:37:17.214762 master-0 kubenswrapper[13046]: I0308 03:37:17.206733 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11"} err="failed to get container status \"d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11\": rpc error: code = NotFound desc = could not find container \"d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11\": container with ID starting with d4de400d8301c12dcac0fec6fbf694bceb9e5bead70ee4567ec2298072652a11 not found: ID does not exist" Mar 08 03:37:17.665553 master-0 kubenswrapper[13046]: I0308 03:37:17.662551 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kpzhp" Mar 08 03:37:17.771558 master-0 kubenswrapper[13046]: I0308 03:37:17.747198 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cd2c544-a565-4f70-978b-667ba7c35a57-operator-scripts\") pod \"2cd2c544-a565-4f70-978b-667ba7c35a57\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " Mar 08 03:37:17.771558 master-0 kubenswrapper[13046]: I0308 03:37:17.747329 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqkml\" (UniqueName: \"kubernetes.io/projected/2cd2c544-a565-4f70-978b-667ba7c35a57-kube-api-access-bqkml\") pod \"2cd2c544-a565-4f70-978b-667ba7c35a57\" (UID: \"2cd2c544-a565-4f70-978b-667ba7c35a57\") " Mar 08 03:37:17.771558 master-0 kubenswrapper[13046]: I0308 03:37:17.748727 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd2c544-a565-4f70-978b-667ba7c35a57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2cd2c544-a565-4f70-978b-667ba7c35a57" (UID: "2cd2c544-a565-4f70-978b-667ba7c35a57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:17.782911 master-0 kubenswrapper[13046]: I0308 03:37:17.775278 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd2c544-a565-4f70-978b-667ba7c35a57-kube-api-access-bqkml" (OuterVolumeSpecName: "kube-api-access-bqkml") pod "2cd2c544-a565-4f70-978b-667ba7c35a57" (UID: "2cd2c544-a565-4f70-978b-667ba7c35a57"). InnerVolumeSpecName "kube-api-access-bqkml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:17.849110 master-0 kubenswrapper[13046]: I0308 03:37:17.849073 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cd2c544-a565-4f70-978b-667ba7c35a57-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:17.849360 master-0 kubenswrapper[13046]: I0308 03:37:17.849345 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqkml\" (UniqueName: \"kubernetes.io/projected/2cd2c544-a565-4f70-978b-667ba7c35a57-kube-api-access-bqkml\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:18.013516 master-0 kubenswrapper[13046]: I0308 03:37:18.013279 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-95b6-account-create-update-jlsmg" Mar 08 03:37:18.024674 master-0 kubenswrapper[13046]: I0308 03:37:18.020819 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xbkt2" Mar 08 03:37:18.038586 master-0 kubenswrapper[13046]: I0308 03:37:18.038533 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb58-account-create-update-l7wpw" Mar 08 03:37:18.062326 master-0 kubenswrapper[13046]: I0308 03:37:18.062276 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c412-account-create-update-zvl4c" Mar 08 03:37:18.143836 master-0 kubenswrapper[13046]: I0308 03:37:18.143784 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xbkt2"
Mar 08 03:37:18.146082 master-0 kubenswrapper[13046]: I0308 03:37:18.146037 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" path="/var/lib/kubelet/pods/38c2671c-0337-4af7-8a29-eef713b62f67/volumes"
Mar 08 03:37:18.146916 master-0 kubenswrapper[13046]: I0308 03:37:18.146875 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-95b6-account-create-update-jlsmg"
Mar 08 03:37:18.149089 master-0 kubenswrapper[13046]: I0308 03:37:18.149063 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c412-account-create-update-zvl4c"
Mar 08 03:37:18.151280 master-0 kubenswrapper[13046]: I0308 03:37:18.151254 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb58-account-create-update-l7wpw"
Mar 08 03:37:18.154436 master-0 kubenswrapper[13046]: I0308 03:37:18.154398 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kpzhp"
Mar 08 03:37:18.157515 master-0 kubenswrapper[13046]: I0308 03:37:18.157429 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xbkt2" event={"ID":"d02bd09c-8e6b-40b9-967d-d93b3621ae5b","Type":"ContainerDied","Data":"5811246e6adf30759431b2b61e17d2a86463585a5ad4592c96ebb9f231c4ee9d"}
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157478 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5811246e6adf30759431b2b61e17d2a86463585a5ad4592c96ebb9f231c4ee9d"
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157558 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-95b6-account-create-update-jlsmg" event={"ID":"17dd1ec6-c2b0-46a9-b162-6efee4a883b9","Type":"ContainerDied","Data":"887100102f886110a5355bcc6ff38ef311312fde42c769513cc2b24e2720259f"}
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157571 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="887100102f886110a5355bcc6ff38ef311312fde42c769513cc2b24e2720259f"
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157580 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c412-account-create-update-zvl4c" event={"ID":"95995527-8076-43fb-8a0e-a9678030ad5e","Type":"ContainerDied","Data":"7e57ab3666c05bf0a4b5d133dcaa9aab486dcf420ced4a45bb3444259e178d88"}
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157589 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e57ab3666c05bf0a4b5d133dcaa9aab486dcf420ced4a45bb3444259e178d88"
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157598 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb58-account-create-update-l7wpw" event={"ID":"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee","Type":"ContainerDied","Data":"99d76020f1ed20c91874db97f3a65f0ff7ea63486db1d2593586bd5d3aebbaf7"}
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157609 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d76020f1ed20c91874db97f3a65f0ff7ea63486db1d2593586bd5d3aebbaf7"
Mar 08 03:37:18.157611 master-0 kubenswrapper[13046]: I0308 03:37:18.157618 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kpzhp" event={"ID":"2cd2c544-a565-4f70-978b-667ba7c35a57","Type":"ContainerDied","Data":"d09b336517509173760f52369cf1e7d4628ddf5119ed182df491250aebe0a262"}
Mar 08 03:37:18.157918 master-0 kubenswrapper[13046]: I0308 03:37:18.157629 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d09b336517509173760f52369cf1e7d4628ddf5119ed182df491250aebe0a262"
Mar 08 03:37:18.162546 master-0 kubenswrapper[13046]: I0308 03:37:18.162513 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95995527-8076-43fb-8a0e-a9678030ad5e-operator-scripts\") pod \"95995527-8076-43fb-8a0e-a9678030ad5e\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") "
Mar 08 03:37:18.162656 master-0 kubenswrapper[13046]: I0308 03:37:18.162625 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-operator-scripts\") pod \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") "
Mar 08 03:37:18.162786 master-0 kubenswrapper[13046]: I0308 03:37:18.162759 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-operator-scripts\") pod \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") "
Mar 08 03:37:18.162998 master-0 kubenswrapper[13046]: I0308 03:37:18.162971 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzkbc\" (UniqueName: \"kubernetes.io/projected/95995527-8076-43fb-8a0e-a9678030ad5e-kube-api-access-vzkbc\") pod \"95995527-8076-43fb-8a0e-a9678030ad5e\" (UID: \"95995527-8076-43fb-8a0e-a9678030ad5e\") "
Mar 08 03:37:18.162998 master-0 kubenswrapper[13046]: I0308 03:37:18.162980 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95995527-8076-43fb-8a0e-a9678030ad5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95995527-8076-43fb-8a0e-a9678030ad5e" (UID: "95995527-8076-43fb-8a0e-a9678030ad5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:18.163111 master-0 kubenswrapper[13046]: I0308 03:37:18.163053 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcjdk\" (UniqueName: \"kubernetes.io/projected/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-kube-api-access-xcjdk\") pod \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") "
Mar 08 03:37:18.163157 master-0 kubenswrapper[13046]: I0308 03:37:18.163139 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-operator-scripts\") pod \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\" (UID: \"17dd1ec6-c2b0-46a9-b162-6efee4a883b9\") "
Mar 08 03:37:18.163205 master-0 kubenswrapper[13046]: I0308 03:37:18.163160 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k8l8\" (UniqueName: \"kubernetes.io/projected/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-kube-api-access-6k8l8\") pod \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\" (UID: \"d02bd09c-8e6b-40b9-967d-d93b3621ae5b\") "
Mar 08 03:37:18.163205 master-0 kubenswrapper[13046]: I0308 03:37:18.163201 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-kube-api-access-ztmb9\") pod \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\" (UID: \"3d3025d7-d6e2-42c7-8352-ec8199a2e9ee\") "
Mar 08 03:37:18.163396 master-0 kubenswrapper[13046]: I0308 03:37:18.163363 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d3025d7-d6e2-42c7-8352-ec8199a2e9ee" (UID: "3d3025d7-d6e2-42c7-8352-ec8199a2e9ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:18.163539 master-0 kubenswrapper[13046]: I0308 03:37:18.163508 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d02bd09c-8e6b-40b9-967d-d93b3621ae5b" (UID: "d02bd09c-8e6b-40b9-967d-d93b3621ae5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:18.163941 master-0 kubenswrapper[13046]: I0308 03:37:18.163899 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17dd1ec6-c2b0-46a9-b162-6efee4a883b9" (UID: "17dd1ec6-c2b0-46a9-b162-6efee4a883b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:18.164728 master-0 kubenswrapper[13046]: I0308 03:37:18.164693 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:18.164789 master-0 kubenswrapper[13046]: I0308 03:37:18.164753 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:18.164835 master-0 kubenswrapper[13046]: I0308 03:37:18.164806 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:18.164988 master-0 kubenswrapper[13046]: I0308 03:37:18.164818 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95995527-8076-43fb-8a0e-a9678030ad5e-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:18.165786 master-0 kubenswrapper[13046]: I0308 03:37:18.165747 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95995527-8076-43fb-8a0e-a9678030ad5e-kube-api-access-vzkbc" (OuterVolumeSpecName: "kube-api-access-vzkbc") pod "95995527-8076-43fb-8a0e-a9678030ad5e" (UID: "95995527-8076-43fb-8a0e-a9678030ad5e"). InnerVolumeSpecName "kube-api-access-vzkbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:18.165949 master-0 kubenswrapper[13046]: I0308 03:37:18.165885 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-kube-api-access-6k8l8" (OuterVolumeSpecName: "kube-api-access-6k8l8") pod "d02bd09c-8e6b-40b9-967d-d93b3621ae5b" (UID: "d02bd09c-8e6b-40b9-967d-d93b3621ae5b"). InnerVolumeSpecName "kube-api-access-6k8l8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:18.167150 master-0 kubenswrapper[13046]: I0308 03:37:18.167113 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-kube-api-access-ztmb9" (OuterVolumeSpecName: "kube-api-access-ztmb9") pod "3d3025d7-d6e2-42c7-8352-ec8199a2e9ee" (UID: "3d3025d7-d6e2-42c7-8352-ec8199a2e9ee"). InnerVolumeSpecName "kube-api-access-ztmb9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:18.176881 master-0 kubenswrapper[13046]: I0308 03:37:18.176816 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-kube-api-access-xcjdk" (OuterVolumeSpecName: "kube-api-access-xcjdk") pod "17dd1ec6-c2b0-46a9-b162-6efee4a883b9" (UID: "17dd1ec6-c2b0-46a9-b162-6efee4a883b9"). InnerVolumeSpecName "kube-api-access-xcjdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:18.268539 master-0 kubenswrapper[13046]: I0308 03:37:18.266722 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee-kube-api-access-ztmb9\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:18.268539 master-0 kubenswrapper[13046]: I0308 03:37:18.266792 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzkbc\" (UniqueName: \"kubernetes.io/projected/95995527-8076-43fb-8a0e-a9678030ad5e-kube-api-access-vzkbc\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:18.268539 master-0 kubenswrapper[13046]: I0308 03:37:18.266810 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcjdk\" (UniqueName: \"kubernetes.io/projected/17dd1ec6-c2b0-46a9-b162-6efee4a883b9-kube-api-access-xcjdk\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:18.268539 master-0 kubenswrapper[13046]: I0308 03:37:18.266828 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k8l8\" (UniqueName: \"kubernetes.io/projected/d02bd09c-8e6b-40b9-967d-d93b3621ae5b-kube-api-access-6k8l8\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:19.261604 master-0 kubenswrapper[13046]: I0308 03:37:19.261477 13046 trace.go:236] Trace[911692271]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (08-Mar-2026 03:37:18.131) (total time: 1130ms):
Mar 08 03:37:19.261604 master-0 kubenswrapper[13046]: Trace[911692271]: [1.130259169s] [1.130259169s] END
Mar 08 03:37:21.193515 master-0 kubenswrapper[13046]: I0308 03:37:21.193406 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e48517-115a-43d0-ad79-a342efe0cf49","Type":"ContainerDied","Data":"fcd397f1176ae3be841d4933fab3fb6f7baaa4572b3a12048ac867b352020bf7"}
Mar 08 03:37:21.195025 master-0 kubenswrapper[13046]: I0308 03:37:21.193348 13046 generic.go:334] "Generic (PLEG): container finished" podID="53e48517-115a-43d0-ad79-a342efe0cf49" containerID="fcd397f1176ae3be841d4933fab3fb6f7baaa4572b3a12048ac867b352020bf7" exitCode=0
Mar 08 03:37:21.199806 master-0 kubenswrapper[13046]: I0308 03:37:21.199439 13046 generic.go:334] "Generic (PLEG): container finished" podID="f646560d-325d-41dc-ac99-a36f08ba0149" containerID="d9940db9a3f723e912a488c34cf3ca80600986c5b2cbcc09c58cacc4e7000330" exitCode=0
Mar 08 03:37:21.199806 master-0 kubenswrapper[13046]: I0308 03:37:21.199565 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbd5b" event={"ID":"f646560d-325d-41dc-ac99-a36f08ba0149","Type":"ContainerDied","Data":"d9940db9a3f723e912a488c34cf3ca80600986c5b2cbcc09c58cacc4e7000330"}
Mar 08 03:37:21.204367 master-0 kubenswrapper[13046]: I0308 03:37:21.204255 13046 generic.go:334] "Generic (PLEG): container finished" podID="5d102cd0-1a7f-4196-883c-bf2fd94fc7f2" containerID="b8b3d996d279e51508fd7b5c06b8a98d8db482cef60ca5b9e6f6accc57dc2f25" exitCode=0
Mar 08 03:37:21.204688 master-0 kubenswrapper[13046]: I0308 03:37:21.204354 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2","Type":"ContainerDied","Data":"b8b3d996d279e51508fd7b5c06b8a98d8db482cef60ca5b9e6f6accc57dc2f25"}
Mar 08 03:37:21.670341 master-0 kubenswrapper[13046]: I0308 03:37:21.670279 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76ff7d945-k9nrf" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.168:5353: i/o timeout"
Mar 08 03:37:22.183683 master-0 kubenswrapper[13046]: I0308 03:37:22.183630 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vvfxv"]
Mar 08 03:37:22.184355 master-0 kubenswrapper[13046]: E0308 03:37:22.184338 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17dd1ec6-c2b0-46a9-b162-6efee4a883b9" containerName="mariadb-account-create-update"
Mar 08 03:37:22.184426 master-0 kubenswrapper[13046]: I0308 03:37:22.184416 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="17dd1ec6-c2b0-46a9-b162-6efee4a883b9" containerName="mariadb-account-create-update"
Mar 08 03:37:22.185179 master-0 kubenswrapper[13046]: E0308 03:37:22.185162 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95995527-8076-43fb-8a0e-a9678030ad5e" containerName="mariadb-account-create-update"
Mar 08 03:37:22.185267 master-0 kubenswrapper[13046]: I0308 03:37:22.185257 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="95995527-8076-43fb-8a0e-a9678030ad5e" containerName="mariadb-account-create-update"
Mar 08 03:37:22.185340 master-0 kubenswrapper[13046]: E0308 03:37:22.185330 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a15b78-56e3-49f1-985a-683f0f3ffde9" containerName="mariadb-account-create-update"
Mar 08 03:37:22.185398 master-0 kubenswrapper[13046]: I0308 03:37:22.185388 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a15b78-56e3-49f1-985a-683f0f3ffde9" containerName="mariadb-account-create-update"
Mar 08 03:37:22.185457 master-0 kubenswrapper[13046]: E0308 03:37:22.185447 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" containerName="dnsmasq-dns"
Mar 08 03:37:22.185551 master-0 kubenswrapper[13046]: I0308 03:37:22.185541 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" containerName="dnsmasq-dns"
Mar 08 03:37:22.185616 master-0 kubenswrapper[13046]: E0308 03:37:22.185605 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd2c544-a565-4f70-978b-667ba7c35a57" containerName="mariadb-database-create"
Mar 08 03:37:22.185678 master-0 kubenswrapper[13046]: I0308 03:37:22.185668 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd2c544-a565-4f70-978b-667ba7c35a57" containerName="mariadb-database-create"
Mar 08 03:37:22.185753 master-0 kubenswrapper[13046]: E0308 03:37:22.185743 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3025d7-d6e2-42c7-8352-ec8199a2e9ee" containerName="mariadb-account-create-update"
Mar 08 03:37:22.185808 master-0 kubenswrapper[13046]: I0308 03:37:22.185799 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3025d7-d6e2-42c7-8352-ec8199a2e9ee" containerName="mariadb-account-create-update"
Mar 08 03:37:22.185866 master-0 kubenswrapper[13046]: E0308 03:37:22.185857 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" containerName="init"
Mar 08 03:37:22.185928 master-0 kubenswrapper[13046]: I0308 03:37:22.185918 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" containerName="init"
Mar 08 03:37:22.185992 master-0 kubenswrapper[13046]: E0308 03:37:22.185982 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0388d7-fdd0-4f0b-9614-8122eb3258ec" containerName="mariadb-database-create"
Mar 08 03:37:22.186047 master-0 kubenswrapper[13046]: I0308 03:37:22.186038 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0388d7-fdd0-4f0b-9614-8122eb3258ec" containerName="mariadb-database-create"
Mar 08 03:37:22.186120 master-0 kubenswrapper[13046]: E0308 03:37:22.186111 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02bd09c-8e6b-40b9-967d-d93b3621ae5b" containerName="mariadb-database-create"
Mar 08 03:37:22.186188 master-0 kubenswrapper[13046]: I0308 03:37:22.186178 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02bd09c-8e6b-40b9-967d-d93b3621ae5b" containerName="mariadb-database-create"
Mar 08 03:37:22.186429 master-0 kubenswrapper[13046]: I0308 03:37:22.186416 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="95995527-8076-43fb-8a0e-a9678030ad5e" containerName="mariadb-account-create-update"
Mar 08 03:37:22.186518 master-0 kubenswrapper[13046]: I0308 03:37:22.186508 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd2c544-a565-4f70-978b-667ba7c35a57" containerName="mariadb-database-create"
Mar 08 03:37:22.186719 master-0 kubenswrapper[13046]: I0308 03:37:22.186583 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="da0388d7-fdd0-4f0b-9614-8122eb3258ec" containerName="mariadb-database-create"
Mar 08 03:37:22.186794 master-0 kubenswrapper[13046]: I0308 03:37:22.186783 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02bd09c-8e6b-40b9-967d-d93b3621ae5b" containerName="mariadb-database-create"
Mar 08 03:37:22.186863 master-0 kubenswrapper[13046]: I0308 03:37:22.186850 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c2671c-0337-4af7-8a29-eef713b62f67" containerName="dnsmasq-dns"
Mar 08 03:37:22.186926 master-0 kubenswrapper[13046]: I0308 03:37:22.186916 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="17dd1ec6-c2b0-46a9-b162-6efee4a883b9" containerName="mariadb-account-create-update"
Mar 08 03:37:22.186989 master-0 kubenswrapper[13046]: I0308 03:37:22.186980 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3025d7-d6e2-42c7-8352-ec8199a2e9ee" containerName="mariadb-account-create-update"
Mar 08 03:37:22.187053 master-0 kubenswrapper[13046]: I0308 03:37:22.187044 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a15b78-56e3-49f1-985a-683f0f3ffde9" containerName="mariadb-account-create-update"
Mar 08 03:37:22.187770 master-0 kubenswrapper[13046]: I0308 03:37:22.187752 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.199172 master-0 kubenswrapper[13046]: I0308 03:37:22.199110 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-config-data"
Mar 08 03:37:22.226355 master-0 kubenswrapper[13046]: I0308 03:37:22.226282 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vvfxv"]
Mar 08 03:37:22.254160 master-0 kubenswrapper[13046]: I0308 03:37:22.244817 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5d102cd0-1a7f-4196-883c-bf2fd94fc7f2","Type":"ContainerStarted","Data":"69cc8fad30c24e77add49aa083dade5e100a1afd07304249f5837f5d704a1b95"}
Mar 08 03:37:22.254160 master-0 kubenswrapper[13046]: I0308 03:37:22.247089 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 03:37:22.254160 master-0 kubenswrapper[13046]: I0308 03:37:22.249030 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e48517-115a-43d0-ad79-a342efe0cf49","Type":"ContainerStarted","Data":"b47592246eb0b39a0ad3d8748bcca35e1a1d34fe260b472446b479d147f97fe0"}
Mar 08 03:37:22.254160 master-0 kubenswrapper[13046]: I0308 03:37:22.249386 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 08 03:37:22.266643 master-0 kubenswrapper[13046]: I0308 03:37:22.266581 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdjwk\" (UniqueName: \"kubernetes.io/projected/1ace6ef2-a01f-4585-8282-c24e3d7a8246-kube-api-access-xdjwk\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.267343 master-0 kubenswrapper[13046]: I0308 03:37:22.267316 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-config-data\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.267716 master-0 kubenswrapper[13046]: I0308 03:37:22.267695 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-combined-ca-bundle\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.267908 master-0 kubenswrapper[13046]: I0308 03:37:22.267890 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-db-sync-config-data\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.311047 master-0 kubenswrapper[13046]: I0308 03:37:22.310924 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.911478985 podStartE2EDuration="56.310897395s" podCreationTimestamp="2026-03-08 03:36:26 +0000 UTC" firstStartedPulling="2026-03-08 03:36:39.906641251 +0000 UTC m=+1401.985408468" lastFinishedPulling="2026-03-08 03:36:47.306059641 +0000 UTC m=+1409.384826878" observedRunningTime="2026-03-08 03:37:22.288053367 +0000 UTC m=+1444.366820594" watchObservedRunningTime="2026-03-08 03:37:22.310897395 +0000 UTC m=+1444.389664612"
Mar 08 03:37:22.337873 master-0 kubenswrapper[13046]: I0308 03:37:22.337769 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.437313245 podStartE2EDuration="56.337752306s" podCreationTimestamp="2026-03-08 03:36:26 +0000 UTC" firstStartedPulling="2026-03-08 03:36:45.445215232 +0000 UTC m=+1407.523982449" lastFinishedPulling="2026-03-08 03:36:47.345654293 +0000 UTC m=+1409.424421510" observedRunningTime="2026-03-08 03:37:22.32800754 +0000 UTC m=+1444.406774757" watchObservedRunningTime="2026-03-08 03:37:22.337752306 +0000 UTC m=+1444.416519523"
Mar 08 03:37:22.372521 master-0 kubenswrapper[13046]: I0308 03:37:22.369722 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdjwk\" (UniqueName: \"kubernetes.io/projected/1ace6ef2-a01f-4585-8282-c24e3d7a8246-kube-api-access-xdjwk\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.372521 master-0 kubenswrapper[13046]: I0308 03:37:22.370429 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-config-data\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.372521 master-0 kubenswrapper[13046]: I0308 03:37:22.370751 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-combined-ca-bundle\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.372521 master-0 kubenswrapper[13046]: I0308 03:37:22.370923 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-db-sync-config-data\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.374569 master-0 kubenswrapper[13046]: I0308 03:37:22.373521 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-config-data\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.375005 master-0 kubenswrapper[13046]: I0308 03:37:22.374983 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-combined-ca-bundle\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.379223 master-0 kubenswrapper[13046]: I0308 03:37:22.379183 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-db-sync-config-data\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.404630 master-0 kubenswrapper[13046]: I0308 03:37:22.390696 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdjwk\" (UniqueName: \"kubernetes.io/projected/1ace6ef2-a01f-4585-8282-c24e3d7a8246-kube-api-access-xdjwk\") pod \"glance-db-sync-vvfxv\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.458090 master-0 kubenswrapper[13046]: I0308 03:37:22.457253 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mtgtj"]
Mar 08 03:37:22.465993 master-0 kubenswrapper[13046]: I0308 03:37:22.465945 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mtgtj"]
Mar 08 03:37:22.512327 master-0 kubenswrapper[13046]: I0308 03:37:22.512260 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vvfxv"
Mar 08 03:37:22.656340 master-0 kubenswrapper[13046]: I0308 03:37:22.655874 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tbd5b"
Mar 08 03:37:22.681857 master-0 kubenswrapper[13046]: I0308 03:37:22.681063 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-dispersionconf\") pod \"f646560d-325d-41dc-ac99-a36f08ba0149\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") "
Mar 08 03:37:22.681857 master-0 kubenswrapper[13046]: I0308 03:37:22.681162 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f646560d-325d-41dc-ac99-a36f08ba0149-etc-swift\") pod \"f646560d-325d-41dc-ac99-a36f08ba0149\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") "
Mar 08 03:37:22.681857 master-0 kubenswrapper[13046]: I0308 03:37:22.681218 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-ring-data-devices\") pod \"f646560d-325d-41dc-ac99-a36f08ba0149\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") "
Mar 08 03:37:22.681857 master-0 kubenswrapper[13046]: I0308 03:37:22.681368 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-swiftconf\") pod \"f646560d-325d-41dc-ac99-a36f08ba0149\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") "
Mar 08 03:37:22.681857 master-0 kubenswrapper[13046]: I0308 03:37:22.681391 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-combined-ca-bundle\") pod \"f646560d-325d-41dc-ac99-a36f08ba0149\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") "
Mar 08 03:37:22.681857 master-0 kubenswrapper[13046]: I0308 03:37:22.681467 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-scripts\") pod \"f646560d-325d-41dc-ac99-a36f08ba0149\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") "
Mar 08 03:37:22.681857 master-0 kubenswrapper[13046]: I0308 03:37:22.681594 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb5h7\" (UniqueName: \"kubernetes.io/projected/f646560d-325d-41dc-ac99-a36f08ba0149-kube-api-access-mb5h7\") pod \"f646560d-325d-41dc-ac99-a36f08ba0149\" (UID: \"f646560d-325d-41dc-ac99-a36f08ba0149\") "
Mar 08 03:37:22.687459 master-0 kubenswrapper[13046]: I0308 03:37:22.686625 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f646560d-325d-41dc-ac99-a36f08ba0149" (UID: "f646560d-325d-41dc-ac99-a36f08ba0149"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:22.699945 master-0 kubenswrapper[13046]: I0308 03:37:22.688548 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f646560d-325d-41dc-ac99-a36f08ba0149-kube-api-access-mb5h7" (OuterVolumeSpecName: "kube-api-access-mb5h7") pod "f646560d-325d-41dc-ac99-a36f08ba0149" (UID: "f646560d-325d-41dc-ac99-a36f08ba0149"). InnerVolumeSpecName "kube-api-access-mb5h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:22.699945 master-0 kubenswrapper[13046]: I0308 03:37:22.688875 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f646560d-325d-41dc-ac99-a36f08ba0149-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f646560d-325d-41dc-ac99-a36f08ba0149" (UID: "f646560d-325d-41dc-ac99-a36f08ba0149"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:37:22.707528 master-0 kubenswrapper[13046]: I0308 03:37:22.707438 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f646560d-325d-41dc-ac99-a36f08ba0149" (UID: "f646560d-325d-41dc-ac99-a36f08ba0149"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:37:22.709178 master-0 kubenswrapper[13046]: I0308 03:37:22.708098 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-scripts" (OuterVolumeSpecName: "scripts") pod "f646560d-325d-41dc-ac99-a36f08ba0149" (UID: "f646560d-325d-41dc-ac99-a36f08ba0149"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:22.746064 master-0 kubenswrapper[13046]: I0308 03:37:22.743012 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f646560d-325d-41dc-ac99-a36f08ba0149" (UID: "f646560d-325d-41dc-ac99-a36f08ba0149"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:37:22.784958 master-0 kubenswrapper[13046]: I0308 03:37:22.784877 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:22.785298 master-0 kubenswrapper[13046]: I0308 03:37:22.784994 13046 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-dispersionconf\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:22.785298 master-0 kubenswrapper[13046]: I0308 03:37:22.785012 13046 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f646560d-325d-41dc-ac99-a36f08ba0149-etc-swift\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:22.785298 master-0 kubenswrapper[13046]: I0308 03:37:22.785024 13046 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-ring-data-devices\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:22.785298 master-0 kubenswrapper[13046]: I0308 03:37:22.785037 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:22.785298 master-0 kubenswrapper[13046]: I0308 03:37:22.785047 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f646560d-325d-41dc-ac99-a36f08ba0149-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:22.785298 master-0 kubenswrapper[13046]: I0308 03:37:22.785056 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb5h7\" (UniqueName: \"kubernetes.io/projected/f646560d-325d-41dc-ac99-a36f08ba0149-kube-api-access-mb5h7\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:22.795090 master-0 kubenswrapper[13046]: I0308 03:37:22.792384 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdcf318b-3d3e-42da-ae4b-39c6a17f8437-etc-swift\") pod \"swift-storage-0\" (UID: \"bdcf318b-3d3e-42da-ae4b-39c6a17f8437\") " pod="openstack/swift-storage-0"
Mar 08 03:37:22.795720 master-0 kubenswrapper[13046]: I0308 03:37:22.795664 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f646560d-325d-41dc-ac99-a36f08ba0149" (UID: "f646560d-325d-41dc-ac99-a36f08ba0149"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:37:22.802916 master-0 kubenswrapper[13046]: I0308 03:37:22.802769 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 08 03:37:22.889270 master-0 kubenswrapper[13046]: I0308 03:37:22.888805 13046 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f646560d-325d-41dc-ac99-a36f08ba0149-swiftconf\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:23.256240 master-0 kubenswrapper[13046]: I0308 03:37:23.256167 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vvfxv"] Mar 08 03:37:23.264648 master-0 kubenswrapper[13046]: I0308 03:37:23.263810 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tbd5b" event={"ID":"f646560d-325d-41dc-ac99-a36f08ba0149","Type":"ContainerDied","Data":"f27ae11df48c1b0ded8c5f1acb4e55b7efa3b1b0e80fc5f84d692c1a9d0d9dc6"} Mar 08 03:37:23.264776 master-0 kubenswrapper[13046]: I0308 03:37:23.264669 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27ae11df48c1b0ded8c5f1acb4e55b7efa3b1b0e80fc5f84d692c1a9d0d9dc6" Mar 08 03:37:23.265000 master-0 kubenswrapper[13046]: I0308 03:37:23.264776 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tbd5b" Mar 08 03:37:23.307095 master-0 kubenswrapper[13046]: I0308 03:37:23.306705 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 03:37:24.135520 master-0 kubenswrapper[13046]: I0308 03:37:24.134557 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a15b78-56e3-49f1-985a-683f0f3ffde9" path="/var/lib/kubelet/pods/c8a15b78-56e3-49f1-985a-683f0f3ffde9/volumes" Mar 08 03:37:24.288636 master-0 kubenswrapper[13046]: I0308 03:37:24.288547 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"486aac186f0f3c5864458519d6e3941c31929f1a698eeeec92f535f57966f78b"} Mar 08 03:37:24.289918 master-0 kubenswrapper[13046]: I0308 03:37:24.289862 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vvfxv" event={"ID":"1ace6ef2-a01f-4585-8282-c24e3d7a8246","Type":"ContainerStarted","Data":"ecda6fe9b18510252124d8d9977dbb2290bf57ff1dd53058c461d95692856a0d"} Mar 08 03:37:24.908325 master-0 kubenswrapper[13046]: I0308 03:37:24.908234 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 08 03:37:26.316080 master-0 kubenswrapper[13046]: I0308 03:37:26.315993 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"7bbf364e8cd5cda252a40cc9b74335c9b10eccbe75021c52e4826f51c116dc54"} Mar 08 03:37:26.316080 master-0 kubenswrapper[13046]: I0308 03:37:26.316076 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"9834c4059175e2b11c62f594ddd567e711b00bb6c503f22e59ccc283dc60ab03"} Mar 08 03:37:26.316713 master-0 kubenswrapper[13046]: 
I0308 03:37:26.316092 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"8d6b98e2780b76185fd2e3c6a52168071f9b1f7622f3fd8b93345758637aefab"} Mar 08 03:37:26.316713 master-0 kubenswrapper[13046]: I0308 03:37:26.316104 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"b54c51e47ef42966ec33de7d695bcad78d092d398cc39bb288e96f2aa4ac106a"} Mar 08 03:37:27.472779 master-0 kubenswrapper[13046]: I0308 03:37:27.472720 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dc6lv"] Mar 08 03:37:27.473320 master-0 kubenswrapper[13046]: E0308 03:37:27.473229 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f646560d-325d-41dc-ac99-a36f08ba0149" containerName="swift-ring-rebalance" Mar 08 03:37:27.473320 master-0 kubenswrapper[13046]: I0308 03:37:27.473249 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f646560d-325d-41dc-ac99-a36f08ba0149" containerName="swift-ring-rebalance" Mar 08 03:37:27.473575 master-0 kubenswrapper[13046]: I0308 03:37:27.473551 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f646560d-325d-41dc-ac99-a36f08ba0149" containerName="swift-ring-rebalance" Mar 08 03:37:27.479103 master-0 kubenswrapper[13046]: I0308 03:37:27.476633 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.481697 master-0 kubenswrapper[13046]: I0308 03:37:27.479683 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 08 03:37:27.492771 master-0 kubenswrapper[13046]: I0308 03:37:27.492706 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dc6lv"] Mar 08 03:37:27.516037 master-0 kubenswrapper[13046]: I0308 03:37:27.512695 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1689a9-e497-4ef2-9cf9-1622e16965d1-operator-scripts\") pod \"root-account-create-update-dc6lv\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.516037 master-0 kubenswrapper[13046]: I0308 03:37:27.512822 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gds9c\" (UniqueName: \"kubernetes.io/projected/3e1689a9-e497-4ef2-9cf9-1622e16965d1-kube-api-access-gds9c\") pod \"root-account-create-update-dc6lv\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.614479 master-0 kubenswrapper[13046]: I0308 03:37:27.614353 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1689a9-e497-4ef2-9cf9-1622e16965d1-operator-scripts\") pod \"root-account-create-update-dc6lv\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.614891 master-0 kubenswrapper[13046]: I0308 03:37:27.614565 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gds9c\" (UniqueName: 
\"kubernetes.io/projected/3e1689a9-e497-4ef2-9cf9-1622e16965d1-kube-api-access-gds9c\") pod \"root-account-create-update-dc6lv\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.615057 master-0 kubenswrapper[13046]: I0308 03:37:27.615009 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1689a9-e497-4ef2-9cf9-1622e16965d1-operator-scripts\") pod \"root-account-create-update-dc6lv\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.651591 master-0 kubenswrapper[13046]: I0308 03:37:27.650937 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gds9c\" (UniqueName: \"kubernetes.io/projected/3e1689a9-e497-4ef2-9cf9-1622e16965d1-kube-api-access-gds9c\") pod \"root-account-create-update-dc6lv\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.662325 master-0 kubenswrapper[13046]: I0308 03:37:27.660603 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4wzhj" podUID="fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c" containerName="ovn-controller" probeResult="failure" output=< Mar 08 03:37:27.662325 master-0 kubenswrapper[13046]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 03:37:27.662325 master-0 kubenswrapper[13046]: > Mar 08 03:37:27.677248 master-0 kubenswrapper[13046]: I0308 03:37:27.676826 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:37:27.689811 master-0 kubenswrapper[13046]: I0308 03:37:27.689763 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5d2gw" Mar 08 03:37:27.828783 master-0 kubenswrapper[13046]: I0308 03:37:27.828715 13046 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:27.999661 master-0 kubenswrapper[13046]: I0308 03:37:27.960906 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4wzhj-config-975fs"] Mar 08 03:37:27.999661 master-0 kubenswrapper[13046]: I0308 03:37:27.962310 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:27.999661 master-0 kubenswrapper[13046]: I0308 03:37:27.964933 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 03:37:27.999661 master-0 kubenswrapper[13046]: I0308 03:37:27.982684 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wzhj-config-975fs"] Mar 08 03:37:28.024313 master-0 kubenswrapper[13046]: I0308 03:37:28.024193 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-scripts\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.024419 master-0 kubenswrapper[13046]: I0308 03:37:28.024359 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-additional-scripts\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.024419 master-0 kubenswrapper[13046]: I0308 03:37:28.024395 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j424\" (UniqueName: 
\"kubernetes.io/projected/68358212-00fe-4d13-9c37-70c62360ead0-kube-api-access-8j424\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.024507 master-0 kubenswrapper[13046]: I0308 03:37:28.024464 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-log-ovn\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.024586 master-0 kubenswrapper[13046]: I0308 03:37:28.024567 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run-ovn\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.024694 master-0 kubenswrapper[13046]: I0308 03:37:28.024640 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.128793 master-0 kubenswrapper[13046]: I0308 03:37:28.128162 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-scripts\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.128793 master-0 kubenswrapper[13046]: I0308 03:37:28.128302 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-additional-scripts\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.128793 master-0 kubenswrapper[13046]: I0308 03:37:28.128336 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j424\" (UniqueName: \"kubernetes.io/projected/68358212-00fe-4d13-9c37-70c62360ead0-kube-api-access-8j424\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.128793 master-0 kubenswrapper[13046]: I0308 03:37:28.128386 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-log-ovn\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.128793 master-0 kubenswrapper[13046]: I0308 03:37:28.128447 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run-ovn\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.128793 master-0 kubenswrapper[13046]: I0308 03:37:28.128519 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 
03:37:28.132233 master-0 kubenswrapper[13046]: I0308 03:37:28.131925 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-additional-scripts\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.133614 master-0 kubenswrapper[13046]: I0308 03:37:28.133565 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run-ovn\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.133687 master-0 kubenswrapper[13046]: I0308 03:37:28.133645 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-log-ovn\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.133947 master-0 kubenswrapper[13046]: I0308 03:37:28.133888 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.135570 master-0 kubenswrapper[13046]: I0308 03:37:28.135429 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-scripts\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 
03:37:28.151870 master-0 kubenswrapper[13046]: I0308 03:37:28.151726 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j424\" (UniqueName: \"kubernetes.io/projected/68358212-00fe-4d13-9c37-70c62360ead0-kube-api-access-8j424\") pod \"ovn-controller-4wzhj-config-975fs\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.331523 master-0 kubenswrapper[13046]: I0308 03:37:28.330012 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:28.371509 master-0 kubenswrapper[13046]: I0308 03:37:28.369578 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"bb3fa675e40103227963ba23eaa50eb0bc56001774236f89f1be16c3d9498ac8"} Mar 08 03:37:28.371509 master-0 kubenswrapper[13046]: I0308 03:37:28.369643 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"d566bf63a5b34e47da5c06a1fba3635e080b3e65b13c8f51e28bfc20256211cb"} Mar 08 03:37:28.371509 master-0 kubenswrapper[13046]: I0308 03:37:28.369662 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"e3eea7541145f8d6b90ff7fa4148bcf44ebb26b5ab3d2ec30ed609be77e23ced"} Mar 08 03:37:28.371509 master-0 kubenswrapper[13046]: I0308 03:37:28.369676 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"d39012af6c20999e5653cb3ba443952c38a6ea6b0374a9d4df24045c1c0b7da2"} Mar 08 03:37:28.454745 master-0 kubenswrapper[13046]: I0308 03:37:28.454693 13046 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/root-account-create-update-dc6lv"] Mar 08 03:37:28.776597 master-0 kubenswrapper[13046]: I0308 03:37:28.776521 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4wzhj-config-975fs"] Mar 08 03:37:29.387780 master-0 kubenswrapper[13046]: I0308 03:37:29.386377 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wzhj-config-975fs" event={"ID":"68358212-00fe-4d13-9c37-70c62360ead0","Type":"ContainerStarted","Data":"ddf754030224c81da1358e95a2d6672e0d5d7b9ca883aa6e3f0f767ef0c9aa66"} Mar 08 03:37:29.387780 master-0 kubenswrapper[13046]: I0308 03:37:29.386453 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wzhj-config-975fs" event={"ID":"68358212-00fe-4d13-9c37-70c62360ead0","Type":"ContainerStarted","Data":"5f2453efc99b2c2d4f472c3146c1c1dcadb4bb980759f3379dc7040f7ee653f9"} Mar 08 03:37:29.408555 master-0 kubenswrapper[13046]: I0308 03:37:29.408184 13046 generic.go:334] "Generic (PLEG): container finished" podID="3e1689a9-e497-4ef2-9cf9-1622e16965d1" containerID="ffcd9b52b57575139419d8472d69f62209c05803d8feecc51fdc332be824fbe6" exitCode=0 Mar 08 03:37:29.408555 master-0 kubenswrapper[13046]: I0308 03:37:29.408284 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dc6lv" event={"ID":"3e1689a9-e497-4ef2-9cf9-1622e16965d1","Type":"ContainerDied","Data":"ffcd9b52b57575139419d8472d69f62209c05803d8feecc51fdc332be824fbe6"} Mar 08 03:37:29.408555 master-0 kubenswrapper[13046]: I0308 03:37:29.408317 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dc6lv" event={"ID":"3e1689a9-e497-4ef2-9cf9-1622e16965d1","Type":"ContainerStarted","Data":"13ffa3cebc8a1d0a3611a3ac80423667ddd9eeb44b117444a35cc1f66e7473ca"} Mar 08 03:37:29.448504 master-0 kubenswrapper[13046]: I0308 03:37:29.443416 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-4wzhj-config-975fs" podStartSLOduration=2.44339917 podStartE2EDuration="2.44339917s" podCreationTimestamp="2026-03-08 03:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:29.442614158 +0000 UTC m=+1451.521381375" watchObservedRunningTime="2026-03-08 03:37:29.44339917 +0000 UTC m=+1451.522166387" Mar 08 03:37:30.440550 master-0 kubenswrapper[13046]: I0308 03:37:30.438071 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"0a2ba9c1e606e7695f7feb0246b58b106cb59fdb4252208faaad8cfec29b2278"} Mar 08 03:37:30.440550 master-0 kubenswrapper[13046]: I0308 03:37:30.438168 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"02a99beb2448c490106949a06b1dc5b8a5b580048db5eef4501f9cf1ade78263"} Mar 08 03:37:30.440550 master-0 kubenswrapper[13046]: I0308 03:37:30.438180 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"802c6773e1557574285830c652554d0e5eaed7b6d688d5b170ca17d65e957c84"} Mar 08 03:37:30.440550 master-0 kubenswrapper[13046]: I0308 03:37:30.438191 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"28b0b453ade60c6799e83100529117c6c72a35e410ae374cb72811545ca05ff9"} Mar 08 03:37:30.453670 master-0 kubenswrapper[13046]: I0308 03:37:30.453596 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4wzhj-config-975fs" 
event={"ID":"68358212-00fe-4d13-9c37-70c62360ead0","Type":"ContainerDied","Data":"ddf754030224c81da1358e95a2d6672e0d5d7b9ca883aa6e3f0f767ef0c9aa66"} Mar 08 03:37:30.453911 master-0 kubenswrapper[13046]: I0308 03:37:30.453745 13046 generic.go:334] "Generic (PLEG): container finished" podID="68358212-00fe-4d13-9c37-70c62360ead0" containerID="ddf754030224c81da1358e95a2d6672e0d5d7b9ca883aa6e3f0f767ef0c9aa66" exitCode=0 Mar 08 03:37:31.046506 master-0 kubenswrapper[13046]: I0308 03:37:31.046017 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:31.123430 master-0 kubenswrapper[13046]: I0308 03:37:31.123367 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1689a9-e497-4ef2-9cf9-1622e16965d1-operator-scripts\") pod \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " Mar 08 03:37:31.123668 master-0 kubenswrapper[13046]: I0308 03:37:31.123647 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gds9c\" (UniqueName: \"kubernetes.io/projected/3e1689a9-e497-4ef2-9cf9-1622e16965d1-kube-api-access-gds9c\") pod \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\" (UID: \"3e1689a9-e497-4ef2-9cf9-1622e16965d1\") " Mar 08 03:37:31.124003 master-0 kubenswrapper[13046]: I0308 03:37:31.123957 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1689a9-e497-4ef2-9cf9-1622e16965d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e1689a9-e497-4ef2-9cf9-1622e16965d1" (UID: "3e1689a9-e497-4ef2-9cf9-1622e16965d1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:31.124193 master-0 kubenswrapper[13046]: I0308 03:37:31.124168 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e1689a9-e497-4ef2-9cf9-1622e16965d1-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:31.129735 master-0 kubenswrapper[13046]: I0308 03:37:31.129699 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1689a9-e497-4ef2-9cf9-1622e16965d1-kube-api-access-gds9c" (OuterVolumeSpecName: "kube-api-access-gds9c") pod "3e1689a9-e497-4ef2-9cf9-1622e16965d1" (UID: "3e1689a9-e497-4ef2-9cf9-1622e16965d1"). InnerVolumeSpecName "kube-api-access-gds9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:31.226543 master-0 kubenswrapper[13046]: I0308 03:37:31.226460 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gds9c\" (UniqueName: \"kubernetes.io/projected/3e1689a9-e497-4ef2-9cf9-1622e16965d1-kube-api-access-gds9c\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:31.471905 master-0 kubenswrapper[13046]: I0308 03:37:31.471828 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dc6lv" event={"ID":"3e1689a9-e497-4ef2-9cf9-1622e16965d1","Type":"ContainerDied","Data":"13ffa3cebc8a1d0a3611a3ac80423667ddd9eeb44b117444a35cc1f66e7473ca"} Mar 08 03:37:31.472508 master-0 kubenswrapper[13046]: I0308 03:37:31.472017 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13ffa3cebc8a1d0a3611a3ac80423667ddd9eeb44b117444a35cc1f66e7473ca" Mar 08 03:37:31.472508 master-0 kubenswrapper[13046]: I0308 03:37:31.471843 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dc6lv" Mar 08 03:37:31.480675 master-0 kubenswrapper[13046]: I0308 03:37:31.479063 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"a0da43c7ff868fc142b65af05826d1ada0e9aaeec6b735a77930afe35f7b7605"} Mar 08 03:37:31.480675 master-0 kubenswrapper[13046]: I0308 03:37:31.479117 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"ea5cf73cb3fff9c178e577cf34b37a6351ce0ee7feb61fe8a488b0ff24f786a9"} Mar 08 03:37:31.480675 master-0 kubenswrapper[13046]: I0308 03:37:31.479133 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bdcf318b-3d3e-42da-ae4b-39c6a17f8437","Type":"ContainerStarted","Data":"2637dba830a8e66cd8727aae24a92147bc5a135261c3af2054353ebf28087bff"} Mar 08 03:37:31.929072 master-0 kubenswrapper[13046]: I0308 03:37:31.928991 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.810350371 podStartE2EDuration="27.928970566s" podCreationTimestamp="2026-03-08 03:37:04 +0000 UTC" firstStartedPulling="2026-03-08 03:37:23.304846379 +0000 UTC m=+1445.383613596" lastFinishedPulling="2026-03-08 03:37:29.423466564 +0000 UTC m=+1451.502233791" observedRunningTime="2026-03-08 03:37:31.553139047 +0000 UTC m=+1453.631906284" watchObservedRunningTime="2026-03-08 03:37:31.928970566 +0000 UTC m=+1454.007737783" Mar 08 03:37:31.933885 master-0 kubenswrapper[13046]: I0308 03:37:31.933833 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8f6c88c7-62p4t"] Mar 08 03:37:31.934282 master-0 kubenswrapper[13046]: E0308 03:37:31.934261 13046 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e1689a9-e497-4ef2-9cf9-1622e16965d1" containerName="mariadb-account-create-update" Mar 08 03:37:31.934282 master-0 kubenswrapper[13046]: I0308 03:37:31.934279 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1689a9-e497-4ef2-9cf9-1622e16965d1" containerName="mariadb-account-create-update" Mar 08 03:37:31.934524 master-0 kubenswrapper[13046]: I0308 03:37:31.934508 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1689a9-e497-4ef2-9cf9-1622e16965d1" containerName="mariadb-account-create-update" Mar 08 03:37:31.935602 master-0 kubenswrapper[13046]: I0308 03:37:31.935581 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:31.941936 master-0 kubenswrapper[13046]: I0308 03:37:31.941768 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 08 03:37:31.952468 master-0 kubenswrapper[13046]: I0308 03:37:31.951311 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8f6c88c7-62p4t"] Mar 08 03:37:32.059296 master-0 kubenswrapper[13046]: I0308 03:37:32.059167 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-svc\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.059296 master-0 kubenswrapper[13046]: I0308 03:37:32.059240 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.059573 master-0 kubenswrapper[13046]: I0308 
03:37:32.059464 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-config\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.059573 master-0 kubenswrapper[13046]: I0308 03:37:32.059551 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.059682 master-0 kubenswrapper[13046]: I0308 03:37:32.059662 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6zz\" (UniqueName: \"kubernetes.io/projected/6312c2ac-32ce-4040-91dd-3c0193d10918-kube-api-access-xm6zz\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.060224 master-0 kubenswrapper[13046]: I0308 03:37:32.060182 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.162247 master-0 kubenswrapper[13046]: I0308 03:37:32.162157 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " 
pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.162454 master-0 kubenswrapper[13046]: I0308 03:37:32.162280 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-svc\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.162454 master-0 kubenswrapper[13046]: I0308 03:37:32.162322 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.162454 master-0 kubenswrapper[13046]: I0308 03:37:32.162392 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-config\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.162454 master-0 kubenswrapper[13046]: I0308 03:37:32.162427 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.162676 master-0 kubenswrapper[13046]: I0308 03:37:32.162502 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6zz\" (UniqueName: \"kubernetes.io/projected/6312c2ac-32ce-4040-91dd-3c0193d10918-kube-api-access-xm6zz\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: 
\"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.163717 master-0 kubenswrapper[13046]: I0308 03:37:32.163319 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.163717 master-0 kubenswrapper[13046]: I0308 03:37:32.163662 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-swift-storage-0\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.164384 master-0 kubenswrapper[13046]: I0308 03:37:32.164343 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.164476 master-0 kubenswrapper[13046]: I0308 03:37:32.164404 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-svc\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.165301 master-0 kubenswrapper[13046]: I0308 03:37:32.165265 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-config\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " 
pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.187571 master-0 kubenswrapper[13046]: I0308 03:37:32.183953 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6zz\" (UniqueName: \"kubernetes.io/projected/6312c2ac-32ce-4040-91dd-3c0193d10918-kube-api-access-xm6zz\") pod \"dnsmasq-dns-6f8f6c88c7-62p4t\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.247744 master-0 kubenswrapper[13046]: I0308 03:37:32.247682 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 03:37:32.259694 master-0 kubenswrapper[13046]: I0308 03:37:32.259630 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:32.664510 master-0 kubenswrapper[13046]: I0308 03:37:32.657795 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-4wzhj" Mar 08 03:37:33.694902 master-0 kubenswrapper[13046]: I0308 03:37:33.694842 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 03:37:34.181691 master-0 kubenswrapper[13046]: I0308 03:37:34.181644 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6blrg"] Mar 08 03:37:34.185797 master-0 kubenswrapper[13046]: I0308 03:37:34.185663 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.199984 master-0 kubenswrapper[13046]: I0308 03:37:34.198777 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6blrg"] Mar 08 03:37:34.300653 master-0 kubenswrapper[13046]: I0308 03:37:34.299621 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1cc2-account-create-update-7nfl4"] Mar 08 03:37:34.304109 master-0 kubenswrapper[13046]: I0308 03:37:34.303853 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.306914 master-0 kubenswrapper[13046]: I0308 03:37:34.306882 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 03:37:34.308643 master-0 kubenswrapper[13046]: I0308 03:37:34.308605 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1cc2-account-create-update-7nfl4"] Mar 08 03:37:34.340564 master-0 kubenswrapper[13046]: I0308 03:37:34.327521 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb5eac99-34cd-4cdf-af11-d7475573518d-operator-scripts\") pod \"cinder-db-create-6blrg\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.340564 master-0 kubenswrapper[13046]: I0308 03:37:34.327730 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8bgl\" (UniqueName: \"kubernetes.io/projected/fb5eac99-34cd-4cdf-af11-d7475573518d-kube-api-access-k8bgl\") pod \"cinder-db-create-6blrg\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.441143 master-0 kubenswrapper[13046]: I0308 03:37:34.431703 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26dfae9-3c54-4102-8762-903c01f9eb23-operator-scripts\") pod \"cinder-1cc2-account-create-update-7nfl4\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.441143 master-0 kubenswrapper[13046]: I0308 03:37:34.432013 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8bgl\" (UniqueName: \"kubernetes.io/projected/fb5eac99-34cd-4cdf-af11-d7475573518d-kube-api-access-k8bgl\") pod \"cinder-db-create-6blrg\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.441143 master-0 kubenswrapper[13046]: I0308 03:37:34.432777 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s78p9\" (UniqueName: \"kubernetes.io/projected/d26dfae9-3c54-4102-8762-903c01f9eb23-kube-api-access-s78p9\") pod \"cinder-1cc2-account-create-update-7nfl4\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.441143 master-0 kubenswrapper[13046]: I0308 03:37:34.432860 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb5eac99-34cd-4cdf-af11-d7475573518d-operator-scripts\") pod \"cinder-db-create-6blrg\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.441143 master-0 kubenswrapper[13046]: I0308 03:37:34.434133 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb5eac99-34cd-4cdf-af11-d7475573518d-operator-scripts\") pod \"cinder-db-create-6blrg\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.451817 master-0 kubenswrapper[13046]: I0308 
03:37:34.451780 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8bgl\" (UniqueName: \"kubernetes.io/projected/fb5eac99-34cd-4cdf-af11-d7475573518d-kube-api-access-k8bgl\") pod \"cinder-db-create-6blrg\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.522251 master-0 kubenswrapper[13046]: I0308 03:37:34.519189 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:34.535873 master-0 kubenswrapper[13046]: I0308 03:37:34.535828 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26dfae9-3c54-4102-8762-903c01f9eb23-operator-scripts\") pod \"cinder-1cc2-account-create-update-7nfl4\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.536036 master-0 kubenswrapper[13046]: I0308 03:37:34.534957 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26dfae9-3c54-4102-8762-903c01f9eb23-operator-scripts\") pod \"cinder-1cc2-account-create-update-7nfl4\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.536194 master-0 kubenswrapper[13046]: I0308 03:37:34.536166 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s78p9\" (UniqueName: \"kubernetes.io/projected/d26dfae9-3c54-4102-8762-903c01f9eb23-kube-api-access-s78p9\") pod \"cinder-1cc2-account-create-update-7nfl4\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.563187 master-0 kubenswrapper[13046]: I0308 03:37:34.563133 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s78p9\" 
(UniqueName: \"kubernetes.io/projected/d26dfae9-3c54-4102-8762-903c01f9eb23-kube-api-access-s78p9\") pod \"cinder-1cc2-account-create-update-7nfl4\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.585554 master-0 kubenswrapper[13046]: I0308 03:37:34.585505 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-cdm9x"] Mar 08 03:37:34.587239 master-0 kubenswrapper[13046]: I0308 03:37:34.587203 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.631551 master-0 kubenswrapper[13046]: I0308 03:37:34.621806 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cdm9x"] Mar 08 03:37:34.638509 master-0 kubenswrapper[13046]: I0308 03:37:34.637927 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:34.665939 master-0 kubenswrapper[13046]: I0308 03:37:34.665034 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2bt4j"] Mar 08 03:37:34.668594 master-0 kubenswrapper[13046]: I0308 03:37:34.668183 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.680105 master-0 kubenswrapper[13046]: I0308 03:37:34.676935 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 03:37:34.680105 master-0 kubenswrapper[13046]: I0308 03:37:34.677030 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 03:37:34.680105 master-0 kubenswrapper[13046]: I0308 03:37:34.677174 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 03:37:34.723452 master-0 kubenswrapper[13046]: I0308 03:37:34.723369 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2bt4j"] Mar 08 03:37:34.732913 master-0 kubenswrapper[13046]: I0308 03:37:34.732846 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79d6-account-create-update-78z48"] Mar 08 03:37:34.734350 master-0 kubenswrapper[13046]: I0308 03:37:34.734322 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:34.739941 master-0 kubenswrapper[13046]: I0308 03:37:34.739891 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 08 03:37:34.744506 master-0 kubenswrapper[13046]: I0308 03:37:34.743737 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79d6-account-create-update-78z48"] Mar 08 03:37:34.755169 master-0 kubenswrapper[13046]: I0308 03:37:34.755116 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d7f97b4-52cc-4108-95c4-dc762cd1398a-operator-scripts\") pod \"neutron-db-create-cdm9x\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.755368 master-0 kubenswrapper[13046]: I0308 03:37:34.755345 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-combined-ca-bundle\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.755443 master-0 kubenswrapper[13046]: I0308 03:37:34.755392 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-config-data\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.755443 master-0 kubenswrapper[13046]: I0308 03:37:34.755418 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz55n\" (UniqueName: \"kubernetes.io/projected/6d7f97b4-52cc-4108-95c4-dc762cd1398a-kube-api-access-bz55n\") pod 
\"neutron-db-create-cdm9x\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.755533 master-0 kubenswrapper[13046]: I0308 03:37:34.755464 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47wqh\" (UniqueName: \"kubernetes.io/projected/7a1f8575-4138-44d5-9be6-14a70bf8170c-kube-api-access-47wqh\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.857862 master-0 kubenswrapper[13046]: I0308 03:37:34.857797 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-combined-ca-bundle\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.857862 master-0 kubenswrapper[13046]: I0308 03:37:34.857875 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-config-data\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.858343 master-0 kubenswrapper[13046]: I0308 03:37:34.857899 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz55n\" (UniqueName: \"kubernetes.io/projected/6d7f97b4-52cc-4108-95c4-dc762cd1398a-kube-api-access-bz55n\") pod \"neutron-db-create-cdm9x\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.858825 master-0 kubenswrapper[13046]: I0308 03:37:34.858653 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a8d02515-fb93-427a-9f9f-d97a1e68ec30-operator-scripts\") pod \"neutron-79d6-account-create-update-78z48\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:34.858825 master-0 kubenswrapper[13046]: I0308 03:37:34.858796 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47wqh\" (UniqueName: \"kubernetes.io/projected/7a1f8575-4138-44d5-9be6-14a70bf8170c-kube-api-access-47wqh\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.861146 master-0 kubenswrapper[13046]: I0308 03:37:34.858919 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c554\" (UniqueName: \"kubernetes.io/projected/a8d02515-fb93-427a-9f9f-d97a1e68ec30-kube-api-access-2c554\") pod \"neutron-79d6-account-create-update-78z48\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:34.861146 master-0 kubenswrapper[13046]: I0308 03:37:34.859061 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d7f97b4-52cc-4108-95c4-dc762cd1398a-operator-scripts\") pod \"neutron-db-create-cdm9x\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.861146 master-0 kubenswrapper[13046]: I0308 03:37:34.859833 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d7f97b4-52cc-4108-95c4-dc762cd1398a-operator-scripts\") pod \"neutron-db-create-cdm9x\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.865708 master-0 kubenswrapper[13046]: I0308 03:37:34.865435 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-combined-ca-bundle\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.869152 master-0 kubenswrapper[13046]: I0308 03:37:34.869110 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-config-data\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.877045 master-0 kubenswrapper[13046]: I0308 03:37:34.876995 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz55n\" (UniqueName: \"kubernetes.io/projected/6d7f97b4-52cc-4108-95c4-dc762cd1398a-kube-api-access-bz55n\") pod \"neutron-db-create-cdm9x\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.877961 master-0 kubenswrapper[13046]: I0308 03:37:34.877924 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47wqh\" (UniqueName: \"kubernetes.io/projected/7a1f8575-4138-44d5-9be6-14a70bf8170c-kube-api-access-47wqh\") pod \"keystone-db-sync-2bt4j\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:34.953190 master-0 kubenswrapper[13046]: I0308 03:37:34.953138 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:34.962312 master-0 kubenswrapper[13046]: I0308 03:37:34.960564 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c554\" (UniqueName: \"kubernetes.io/projected/a8d02515-fb93-427a-9f9f-d97a1e68ec30-kube-api-access-2c554\") pod \"neutron-79d6-account-create-update-78z48\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:34.962312 master-0 kubenswrapper[13046]: I0308 03:37:34.960782 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d02515-fb93-427a-9f9f-d97a1e68ec30-operator-scripts\") pod \"neutron-79d6-account-create-update-78z48\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:34.962312 master-0 kubenswrapper[13046]: I0308 03:37:34.961465 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d02515-fb93-427a-9f9f-d97a1e68ec30-operator-scripts\") pod \"neutron-79d6-account-create-update-78z48\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:34.978555 master-0 kubenswrapper[13046]: I0308 03:37:34.978385 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c554\" (UniqueName: \"kubernetes.io/projected/a8d02515-fb93-427a-9f9f-d97a1e68ec30-kube-api-access-2c554\") pod \"neutron-79d6-account-create-update-78z48\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:35.023710 master-0 kubenswrapper[13046]: I0308 03:37:35.022249 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:35.058569 master-0 kubenswrapper[13046]: I0308 03:37:35.058504 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:39.838870 master-0 kubenswrapper[13046]: I0308 03:37:39.838264 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.974726 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-log-ovn\") pod \"68358212-00fe-4d13-9c37-70c62360ead0\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.974778 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run\") pod \"68358212-00fe-4d13-9c37-70c62360ead0\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.974843 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j424\" (UniqueName: \"kubernetes.io/projected/68358212-00fe-4d13-9c37-70c62360ead0-kube-api-access-8j424\") pod \"68358212-00fe-4d13-9c37-70c62360ead0\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.974859 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-scripts\") pod \"68358212-00fe-4d13-9c37-70c62360ead0\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 
03:37:39.975018 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run-ovn\") pod \"68358212-00fe-4d13-9c37-70c62360ead0\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.975054 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-additional-scripts\") pod \"68358212-00fe-4d13-9c37-70c62360ead0\" (UID: \"68358212-00fe-4d13-9c37-70c62360ead0\") " Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.975848 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "68358212-00fe-4d13-9c37-70c62360ead0" (UID: "68358212-00fe-4d13-9c37-70c62360ead0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.976531 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-scripts" (OuterVolumeSpecName: "scripts") pod "68358212-00fe-4d13-9c37-70c62360ead0" (UID: "68358212-00fe-4d13-9c37-70c62360ead0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.976558 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "68358212-00fe-4d13-9c37-70c62360ead0" (UID: "68358212-00fe-4d13-9c37-70c62360ead0"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.976577 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "68358212-00fe-4d13-9c37-70c62360ead0" (UID: "68358212-00fe-4d13-9c37-70c62360ead0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.976593 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run" (OuterVolumeSpecName: "var-run") pod "68358212-00fe-4d13-9c37-70c62360ead0" (UID: "68358212-00fe-4d13-9c37-70c62360ead0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:37:39.980505 master-0 kubenswrapper[13046]: I0308 03:37:39.979778 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68358212-00fe-4d13-9c37-70c62360ead0-kube-api-access-8j424" (OuterVolumeSpecName: "kube-api-access-8j424") pod "68358212-00fe-4d13-9c37-70c62360ead0" (UID: "68358212-00fe-4d13-9c37-70c62360ead0"). InnerVolumeSpecName "kube-api-access-8j424". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:40.078496 master-0 kubenswrapper[13046]: I0308 03:37:40.078383 13046 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:40.078496 master-0 kubenswrapper[13046]: I0308 03:37:40.078413 13046 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:40.078496 master-0 kubenswrapper[13046]: I0308 03:37:40.078422 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:40.078496 master-0 kubenswrapper[13046]: I0308 03:37:40.078432 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j424\" (UniqueName: \"kubernetes.io/projected/68358212-00fe-4d13-9c37-70c62360ead0-kube-api-access-8j424\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:40.078496 master-0 kubenswrapper[13046]: I0308 03:37:40.078444 13046 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68358212-00fe-4d13-9c37-70c62360ead0-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:40.078496 master-0 kubenswrapper[13046]: I0308 03:37:40.078452 13046 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/68358212-00fe-4d13-9c37-70c62360ead0-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:40.501587 master-0 kubenswrapper[13046]: I0308 03:37:40.501468 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8f6c88c7-62p4t"] Mar 08 03:37:40.523273 master-0 kubenswrapper[13046]: I0308 
03:37:40.523150 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1cc2-account-create-update-7nfl4"] Mar 08 03:37:40.533376 master-0 kubenswrapper[13046]: W0308 03:37:40.533327 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8d02515_fb93_427a_9f9f_d97a1e68ec30.slice/crio-1696c4245e809d1d55873357bc0df6955e5b40f29b68c55170fca69c5e51432e WatchSource:0}: Error finding container 1696c4245e809d1d55873357bc0df6955e5b40f29b68c55170fca69c5e51432e: Status 404 returned error can't find the container with id 1696c4245e809d1d55873357bc0df6955e5b40f29b68c55170fca69c5e51432e Mar 08 03:37:40.535927 master-0 kubenswrapper[13046]: W0308 03:37:40.535785 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb5eac99_34cd_4cdf_af11_d7475573518d.slice/crio-7793db8eae41791edd6217fdf792ce2a5cd875159239036065a440be33248330 WatchSource:0}: Error finding container 7793db8eae41791edd6217fdf792ce2a5cd875159239036065a440be33248330: Status 404 returned error can't find the container with id 7793db8eae41791edd6217fdf792ce2a5cd875159239036065a440be33248330 Mar 08 03:37:40.565346 master-0 kubenswrapper[13046]: I0308 03:37:40.564043 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6blrg"] Mar 08 03:37:40.585970 master-0 kubenswrapper[13046]: I0308 03:37:40.583126 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79d6-account-create-update-78z48"] Mar 08 03:37:40.592558 master-0 kubenswrapper[13046]: I0308 03:37:40.586651 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vvfxv" event={"ID":"1ace6ef2-a01f-4585-8282-c24e3d7a8246","Type":"ContainerStarted","Data":"0bc43cb2fb95a1524b067c67123e4f5b9af00b5c2abb9f0071b25bb073e375d3"} Mar 08 03:37:40.601889 master-0 kubenswrapper[13046]: I0308 03:37:40.601800 
13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6blrg" event={"ID":"fb5eac99-34cd-4cdf-af11-d7475573518d","Type":"ContainerStarted","Data":"7793db8eae41791edd6217fdf792ce2a5cd875159239036065a440be33248330"} Mar 08 03:37:40.603544 master-0 kubenswrapper[13046]: I0308 03:37:40.603495 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79d6-account-create-update-78z48" event={"ID":"a8d02515-fb93-427a-9f9f-d97a1e68ec30","Type":"ContainerStarted","Data":"1696c4245e809d1d55873357bc0df6955e5b40f29b68c55170fca69c5e51432e"} Mar 08 03:37:40.614510 master-0 kubenswrapper[13046]: I0308 03:37:40.613811 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" event={"ID":"6312c2ac-32ce-4040-91dd-3c0193d10918","Type":"ContainerStarted","Data":"f4e4f0f73b310e3496256add05e6ec7e9b49c0db7aa120fb27ae61785008faea"} Mar 08 03:37:40.621864 master-0 kubenswrapper[13046]: I0308 03:37:40.621791 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vvfxv" podStartSLOduration=2.046810974 podStartE2EDuration="18.621770509s" podCreationTimestamp="2026-03-08 03:37:22 +0000 UTC" firstStartedPulling="2026-03-08 03:37:23.262790477 +0000 UTC m=+1445.341557684" lastFinishedPulling="2026-03-08 03:37:39.837750002 +0000 UTC m=+1461.916517219" observedRunningTime="2026-03-08 03:37:40.606446553 +0000 UTC m=+1462.685213780" watchObservedRunningTime="2026-03-08 03:37:40.621770509 +0000 UTC m=+1462.700537726" Mar 08 03:37:40.629794 master-0 kubenswrapper[13046]: I0308 03:37:40.629723 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1cc2-account-create-update-7nfl4" event={"ID":"d26dfae9-3c54-4102-8762-903c01f9eb23","Type":"ContainerStarted","Data":"3c817f5bb27bfb8aed21681e795f934b9895e12903960218d72035116dfee6df"} Mar 08 03:37:40.631967 master-0 kubenswrapper[13046]: I0308 03:37:40.631910 13046 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-4wzhj-config-975fs" event={"ID":"68358212-00fe-4d13-9c37-70c62360ead0","Type":"ContainerDied","Data":"5f2453efc99b2c2d4f472c3146c1c1dcadb4bb980759f3379dc7040f7ee653f9"} Mar 08 03:37:40.632092 master-0 kubenswrapper[13046]: I0308 03:37:40.631972 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f2453efc99b2c2d4f472c3146c1c1dcadb4bb980759f3379dc7040f7ee653f9" Mar 08 03:37:40.632149 master-0 kubenswrapper[13046]: I0308 03:37:40.632129 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4wzhj-config-975fs" Mar 08 03:37:40.730693 master-0 kubenswrapper[13046]: W0308 03:37:40.730570 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1f8575_4138_44d5_9be6_14a70bf8170c.slice/crio-25336bb7082dd575cf7937c1ddae6aece2a12b57df9b42b5156b43b4e9882377 WatchSource:0}: Error finding container 25336bb7082dd575cf7937c1ddae6aece2a12b57df9b42b5156b43b4e9882377: Status 404 returned error can't find the container with id 25336bb7082dd575cf7937c1ddae6aece2a12b57df9b42b5156b43b4e9882377 Mar 08 03:37:40.739794 master-0 kubenswrapper[13046]: W0308 03:37:40.739748 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d7f97b4_52cc_4108_95c4_dc762cd1398a.slice/crio-5624d27b2df8603121574b7dc1f67b58be112ec605e8a5ecba1a1a7ee6348b53 WatchSource:0}: Error finding container 5624d27b2df8603121574b7dc1f67b58be112ec605e8a5ecba1a1a7ee6348b53: Status 404 returned error can't find the container with id 5624d27b2df8603121574b7dc1f67b58be112ec605e8a5ecba1a1a7ee6348b53 Mar 08 03:37:40.739909 master-0 kubenswrapper[13046]: I0308 03:37:40.739887 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2bt4j"] Mar 08 03:37:40.751757 master-0 kubenswrapper[13046]: 
I0308 03:37:40.751675 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-cdm9x"] Mar 08 03:37:40.977393 master-0 kubenswrapper[13046]: I0308 03:37:40.977283 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4wzhj-config-975fs"] Mar 08 03:37:40.994374 master-0 kubenswrapper[13046]: I0308 03:37:40.994320 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4wzhj-config-975fs"] Mar 08 03:37:41.647368 master-0 kubenswrapper[13046]: I0308 03:37:41.647283 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1cc2-account-create-update-7nfl4" event={"ID":"d26dfae9-3c54-4102-8762-903c01f9eb23","Type":"ContainerStarted","Data":"c7bba9327c811230db2351bff4a5762e0a574248f1888fc51e9a20c50f9972fb"} Mar 08 03:37:41.653828 master-0 kubenswrapper[13046]: I0308 03:37:41.653746 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6blrg" event={"ID":"fb5eac99-34cd-4cdf-af11-d7475573518d","Type":"ContainerStarted","Data":"a52466d9de2c81b5a9aa547f69f813cb0e0514604ff17f5e8119d310d0a49e89"} Mar 08 03:37:41.655537 master-0 kubenswrapper[13046]: I0308 03:37:41.655463 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bt4j" event={"ID":"7a1f8575-4138-44d5-9be6-14a70bf8170c","Type":"ContainerStarted","Data":"25336bb7082dd575cf7937c1ddae6aece2a12b57df9b42b5156b43b4e9882377"} Mar 08 03:37:41.657987 master-0 kubenswrapper[13046]: I0308 03:37:41.657890 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cdm9x" event={"ID":"6d7f97b4-52cc-4108-95c4-dc762cd1398a","Type":"ContainerDied","Data":"94cdc5f29a6d860e7bc9f6764c6033f8792d94583a8fb39bfe4d486d089b390f"} Mar 08 03:37:41.657987 master-0 kubenswrapper[13046]: I0308 03:37:41.657872 13046 generic.go:334] "Generic (PLEG): container finished" podID="6d7f97b4-52cc-4108-95c4-dc762cd1398a" 
containerID="94cdc5f29a6d860e7bc9f6764c6033f8792d94583a8fb39bfe4d486d089b390f" exitCode=0 Mar 08 03:37:41.658233 master-0 kubenswrapper[13046]: I0308 03:37:41.658081 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cdm9x" event={"ID":"6d7f97b4-52cc-4108-95c4-dc762cd1398a","Type":"ContainerStarted","Data":"5624d27b2df8603121574b7dc1f67b58be112ec605e8a5ecba1a1a7ee6348b53"} Mar 08 03:37:41.659678 master-0 kubenswrapper[13046]: I0308 03:37:41.659638 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79d6-account-create-update-78z48" event={"ID":"a8d02515-fb93-427a-9f9f-d97a1e68ec30","Type":"ContainerStarted","Data":"37f135cf2b0475a62ec0c2666445995a3e6aec1d5a46e1675726a69ce8cdc369"} Mar 08 03:37:41.662035 master-0 kubenswrapper[13046]: I0308 03:37:41.661991 13046 generic.go:334] "Generic (PLEG): container finished" podID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerID="cf9b9b6be338f8d9f4e83cf459ebda8704f03aa9b7cba1862adc58ad1b5d94f3" exitCode=0 Mar 08 03:37:41.662906 master-0 kubenswrapper[13046]: I0308 03:37:41.662865 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" event={"ID":"6312c2ac-32ce-4040-91dd-3c0193d10918","Type":"ContainerDied","Data":"cf9b9b6be338f8d9f4e83cf459ebda8704f03aa9b7cba1862adc58ad1b5d94f3"} Mar 08 03:37:41.864900 master-0 kubenswrapper[13046]: I0308 03:37:41.863733 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-1cc2-account-create-update-7nfl4" podStartSLOduration=7.863714739 podStartE2EDuration="7.863714739s" podCreationTimestamp="2026-03-08 03:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:41.851674916 +0000 UTC m=+1463.930442133" watchObservedRunningTime="2026-03-08 03:37:41.863714739 +0000 UTC m=+1463.942481956" Mar 08 03:37:41.906575 master-0 kubenswrapper[13046]: 
I0308 03:37:41.894564 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79d6-account-create-update-78z48" podStartSLOduration=7.894544855 podStartE2EDuration="7.894544855s" podCreationTimestamp="2026-03-08 03:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:41.868807173 +0000 UTC m=+1463.947574390" watchObservedRunningTime="2026-03-08 03:37:41.894544855 +0000 UTC m=+1463.973312072" Mar 08 03:37:41.935342 master-0 kubenswrapper[13046]: I0308 03:37:41.935243 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-6blrg" podStartSLOduration=7.93522017 podStartE2EDuration="7.93522017s" podCreationTimestamp="2026-03-08 03:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:41.91407719 +0000 UTC m=+1463.992844417" watchObservedRunningTime="2026-03-08 03:37:41.93522017 +0000 UTC m=+1464.013987387" Mar 08 03:37:42.147096 master-0 kubenswrapper[13046]: I0308 03:37:42.147036 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68358212-00fe-4d13-9c37-70c62360ead0" path="/var/lib/kubelet/pods/68358212-00fe-4d13-9c37-70c62360ead0/volumes" Mar 08 03:37:42.677898 master-0 kubenswrapper[13046]: I0308 03:37:42.677843 13046 generic.go:334] "Generic (PLEG): container finished" podID="fb5eac99-34cd-4cdf-af11-d7475573518d" containerID="a52466d9de2c81b5a9aa547f69f813cb0e0514604ff17f5e8119d310d0a49e89" exitCode=0 Mar 08 03:37:42.678049 master-0 kubenswrapper[13046]: I0308 03:37:42.677936 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6blrg" event={"ID":"fb5eac99-34cd-4cdf-af11-d7475573518d","Type":"ContainerDied","Data":"a52466d9de2c81b5a9aa547f69f813cb0e0514604ff17f5e8119d310d0a49e89"} Mar 08 03:37:42.683822 
master-0 kubenswrapper[13046]: I0308 03:37:42.683754 13046 generic.go:334] "Generic (PLEG): container finished" podID="a8d02515-fb93-427a-9f9f-d97a1e68ec30" containerID="37f135cf2b0475a62ec0c2666445995a3e6aec1d5a46e1675726a69ce8cdc369" exitCode=0 Mar 08 03:37:42.683822 master-0 kubenswrapper[13046]: I0308 03:37:42.683825 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79d6-account-create-update-78z48" event={"ID":"a8d02515-fb93-427a-9f9f-d97a1e68ec30","Type":"ContainerDied","Data":"37f135cf2b0475a62ec0c2666445995a3e6aec1d5a46e1675726a69ce8cdc369"} Mar 08 03:37:42.687241 master-0 kubenswrapper[13046]: I0308 03:37:42.687213 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" event={"ID":"6312c2ac-32ce-4040-91dd-3c0193d10918","Type":"ContainerStarted","Data":"c2ff03ed8f6c4c79eafb130fac7fbb943bb63acc7415c283d5142fba3d3e695f"} Mar 08 03:37:42.688075 master-0 kubenswrapper[13046]: I0308 03:37:42.687911 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:42.691322 master-0 kubenswrapper[13046]: I0308 03:37:42.691210 13046 generic.go:334] "Generic (PLEG): container finished" podID="d26dfae9-3c54-4102-8762-903c01f9eb23" containerID="c7bba9327c811230db2351bff4a5762e0a574248f1888fc51e9a20c50f9972fb" exitCode=0 Mar 08 03:37:42.691471 master-0 kubenswrapper[13046]: I0308 03:37:42.691417 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1cc2-account-create-update-7nfl4" event={"ID":"d26dfae9-3c54-4102-8762-903c01f9eb23","Type":"ContainerDied","Data":"c7bba9327c811230db2351bff4a5762e0a574248f1888fc51e9a20c50f9972fb"} Mar 08 03:37:42.789353 master-0 kubenswrapper[13046]: I0308 03:37:42.789252 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" podStartSLOduration=11.789231397 podStartE2EDuration="11.789231397s" 
podCreationTimestamp="2026-03-08 03:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:42.777881245 +0000 UTC m=+1464.856648462" watchObservedRunningTime="2026-03-08 03:37:42.789231397 +0000 UTC m=+1464.867998614" Mar 08 03:37:46.138425 master-0 kubenswrapper[13046]: I0308 03:37:46.137613 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:46.146496 master-0 kubenswrapper[13046]: I0308 03:37:46.145698 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:46.153620 master-0 kubenswrapper[13046]: I0308 03:37:46.151980 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:46.159388 master-0 kubenswrapper[13046]: I0308 03:37:46.159352 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:46.184804 master-0 kubenswrapper[13046]: I0308 03:37:46.184707 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s78p9\" (UniqueName: \"kubernetes.io/projected/d26dfae9-3c54-4102-8762-903c01f9eb23-kube-api-access-s78p9\") pod \"d26dfae9-3c54-4102-8762-903c01f9eb23\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " Mar 08 03:37:46.184978 master-0 kubenswrapper[13046]: I0308 03:37:46.184822 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d7f97b4-52cc-4108-95c4-dc762cd1398a-operator-scripts\") pod \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " Mar 08 03:37:46.185064 master-0 kubenswrapper[13046]: I0308 03:37:46.185039 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26dfae9-3c54-4102-8762-903c01f9eb23-operator-scripts\") pod \"d26dfae9-3c54-4102-8762-903c01f9eb23\" (UID: \"d26dfae9-3c54-4102-8762-903c01f9eb23\") " Mar 08 03:37:46.185213 master-0 kubenswrapper[13046]: I0308 03:37:46.185186 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz55n\" (UniqueName: \"kubernetes.io/projected/6d7f97b4-52cc-4108-95c4-dc762cd1398a-kube-api-access-bz55n\") pod \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\" (UID: \"6d7f97b4-52cc-4108-95c4-dc762cd1398a\") " Mar 08 03:37:46.185256 master-0 kubenswrapper[13046]: I0308 03:37:46.185222 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c554\" (UniqueName: \"kubernetes.io/projected/a8d02515-fb93-427a-9f9f-d97a1e68ec30-kube-api-access-2c554\") pod \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " Mar 08 03:37:46.185256 
master-0 kubenswrapper[13046]: I0308 03:37:46.185247 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8bgl\" (UniqueName: \"kubernetes.io/projected/fb5eac99-34cd-4cdf-af11-d7475573518d-kube-api-access-k8bgl\") pod \"fb5eac99-34cd-4cdf-af11-d7475573518d\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " Mar 08 03:37:46.185330 master-0 kubenswrapper[13046]: I0308 03:37:46.185290 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb5eac99-34cd-4cdf-af11-d7475573518d-operator-scripts\") pod \"fb5eac99-34cd-4cdf-af11-d7475573518d\" (UID: \"fb5eac99-34cd-4cdf-af11-d7475573518d\") " Mar 08 03:37:46.185330 master-0 kubenswrapper[13046]: I0308 03:37:46.185314 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d02515-fb93-427a-9f9f-d97a1e68ec30-operator-scripts\") pod \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\" (UID: \"a8d02515-fb93-427a-9f9f-d97a1e68ec30\") " Mar 08 03:37:46.185439 master-0 kubenswrapper[13046]: I0308 03:37:46.185383 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d7f97b4-52cc-4108-95c4-dc762cd1398a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d7f97b4-52cc-4108-95c4-dc762cd1398a" (UID: "6d7f97b4-52cc-4108-95c4-dc762cd1398a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:46.185905 master-0 kubenswrapper[13046]: I0308 03:37:46.185557 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26dfae9-3c54-4102-8762-903c01f9eb23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d26dfae9-3c54-4102-8762-903c01f9eb23" (UID: "d26dfae9-3c54-4102-8762-903c01f9eb23"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:46.185970 master-0 kubenswrapper[13046]: I0308 03:37:46.185913 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d26dfae9-3c54-4102-8762-903c01f9eb23-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.185970 master-0 kubenswrapper[13046]: I0308 03:37:46.185934 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d7f97b4-52cc-4108-95c4-dc762cd1398a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.186138 master-0 kubenswrapper[13046]: I0308 03:37:46.186111 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb5eac99-34cd-4cdf-af11-d7475573518d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb5eac99-34cd-4cdf-af11-d7475573518d" (UID: "fb5eac99-34cd-4cdf-af11-d7475573518d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:46.186233 master-0 kubenswrapper[13046]: I0308 03:37:46.186195 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d02515-fb93-427a-9f9f-d97a1e68ec30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8d02515-fb93-427a-9f9f-d97a1e68ec30" (UID: "a8d02515-fb93-427a-9f9f-d97a1e68ec30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:46.188430 master-0 kubenswrapper[13046]: I0308 03:37:46.188394 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26dfae9-3c54-4102-8762-903c01f9eb23-kube-api-access-s78p9" (OuterVolumeSpecName: "kube-api-access-s78p9") pod "d26dfae9-3c54-4102-8762-903c01f9eb23" (UID: "d26dfae9-3c54-4102-8762-903c01f9eb23"). InnerVolumeSpecName "kube-api-access-s78p9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:46.189588 master-0 kubenswrapper[13046]: I0308 03:37:46.189518 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7f97b4-52cc-4108-95c4-dc762cd1398a-kube-api-access-bz55n" (OuterVolumeSpecName: "kube-api-access-bz55n") pod "6d7f97b4-52cc-4108-95c4-dc762cd1398a" (UID: "6d7f97b4-52cc-4108-95c4-dc762cd1398a"). InnerVolumeSpecName "kube-api-access-bz55n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:46.190721 master-0 kubenswrapper[13046]: I0308 03:37:46.190683 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5eac99-34cd-4cdf-af11-d7475573518d-kube-api-access-k8bgl" (OuterVolumeSpecName: "kube-api-access-k8bgl") pod "fb5eac99-34cd-4cdf-af11-d7475573518d" (UID: "fb5eac99-34cd-4cdf-af11-d7475573518d"). InnerVolumeSpecName "kube-api-access-k8bgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:46.190829 master-0 kubenswrapper[13046]: I0308 03:37:46.190782 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d02515-fb93-427a-9f9f-d97a1e68ec30-kube-api-access-2c554" (OuterVolumeSpecName: "kube-api-access-2c554") pod "a8d02515-fb93-427a-9f9f-d97a1e68ec30" (UID: "a8d02515-fb93-427a-9f9f-d97a1e68ec30"). InnerVolumeSpecName "kube-api-access-2c554". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:46.290329 master-0 kubenswrapper[13046]: I0308 03:37:46.290166 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s78p9\" (UniqueName: \"kubernetes.io/projected/d26dfae9-3c54-4102-8762-903c01f9eb23-kube-api-access-s78p9\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.290329 master-0 kubenswrapper[13046]: I0308 03:37:46.290230 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz55n\" (UniqueName: \"kubernetes.io/projected/6d7f97b4-52cc-4108-95c4-dc762cd1398a-kube-api-access-bz55n\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.290329 master-0 kubenswrapper[13046]: I0308 03:37:46.290246 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c554\" (UniqueName: \"kubernetes.io/projected/a8d02515-fb93-427a-9f9f-d97a1e68ec30-kube-api-access-2c554\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.290329 master-0 kubenswrapper[13046]: I0308 03:37:46.290260 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8bgl\" (UniqueName: \"kubernetes.io/projected/fb5eac99-34cd-4cdf-af11-d7475573518d-kube-api-access-k8bgl\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.290329 master-0 kubenswrapper[13046]: I0308 03:37:46.290276 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb5eac99-34cd-4cdf-af11-d7475573518d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.290329 master-0 kubenswrapper[13046]: I0308 03:37:46.290290 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d02515-fb93-427a-9f9f-d97a1e68ec30-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:46.754917 master-0 kubenswrapper[13046]: I0308 03:37:46.754847 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-cdm9x" Mar 08 03:37:46.755137 master-0 kubenswrapper[13046]: I0308 03:37:46.755037 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-cdm9x" event={"ID":"6d7f97b4-52cc-4108-95c4-dc762cd1398a","Type":"ContainerDied","Data":"5624d27b2df8603121574b7dc1f67b58be112ec605e8a5ecba1a1a7ee6348b53"} Mar 08 03:37:46.755137 master-0 kubenswrapper[13046]: I0308 03:37:46.755074 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5624d27b2df8603121574b7dc1f67b58be112ec605e8a5ecba1a1a7ee6348b53" Mar 08 03:37:46.756351 master-0 kubenswrapper[13046]: I0308 03:37:46.756304 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79d6-account-create-update-78z48" Mar 08 03:37:46.756568 master-0 kubenswrapper[13046]: I0308 03:37:46.756534 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79d6-account-create-update-78z48" event={"ID":"a8d02515-fb93-427a-9f9f-d97a1e68ec30","Type":"ContainerDied","Data":"1696c4245e809d1d55873357bc0df6955e5b40f29b68c55170fca69c5e51432e"} Mar 08 03:37:46.756635 master-0 kubenswrapper[13046]: I0308 03:37:46.756609 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1696c4245e809d1d55873357bc0df6955e5b40f29b68c55170fca69c5e51432e" Mar 08 03:37:46.757734 master-0 kubenswrapper[13046]: I0308 03:37:46.757703 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-1cc2-account-create-update-7nfl4" Mar 08 03:37:46.757838 master-0 kubenswrapper[13046]: I0308 03:37:46.757707 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1cc2-account-create-update-7nfl4" event={"ID":"d26dfae9-3c54-4102-8762-903c01f9eb23","Type":"ContainerDied","Data":"3c817f5bb27bfb8aed21681e795f934b9895e12903960218d72035116dfee6df"} Mar 08 03:37:46.757887 master-0 kubenswrapper[13046]: I0308 03:37:46.757852 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c817f5bb27bfb8aed21681e795f934b9895e12903960218d72035116dfee6df" Mar 08 03:37:46.759288 master-0 kubenswrapper[13046]: I0308 03:37:46.759257 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6blrg" event={"ID":"fb5eac99-34cd-4cdf-af11-d7475573518d","Type":"ContainerDied","Data":"7793db8eae41791edd6217fdf792ce2a5cd875159239036065a440be33248330"} Mar 08 03:37:46.759288 master-0 kubenswrapper[13046]: I0308 03:37:46.759283 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7793db8eae41791edd6217fdf792ce2a5cd875159239036065a440be33248330" Mar 08 03:37:46.759411 master-0 kubenswrapper[13046]: I0308 03:37:46.759276 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6blrg" Mar 08 03:37:47.263822 master-0 kubenswrapper[13046]: I0308 03:37:47.263743 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:37:47.375286 master-0 kubenswrapper[13046]: I0308 03:37:47.375238 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-66sbz"] Mar 08 03:37:47.375845 master-0 kubenswrapper[13046]: I0308 03:37:47.375811 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" podUID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerName="dnsmasq-dns" containerID="cri-o://18ca1a76436c4966da4a1dd5931cb3c3ecfdab86f06b2055bde8316c3eaa330c" gracePeriod=10 Mar 08 03:37:47.779365 master-0 kubenswrapper[13046]: I0308 03:37:47.779313 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bt4j" event={"ID":"7a1f8575-4138-44d5-9be6-14a70bf8170c","Type":"ContainerStarted","Data":"fb162992e6bc528bee5f9c0586b2d640cbe00d84c7c185f91dffcdf27cbe0181"} Mar 08 03:37:47.782687 master-0 kubenswrapper[13046]: I0308 03:37:47.782644 13046 generic.go:334] "Generic (PLEG): container finished" podID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerID="18ca1a76436c4966da4a1dd5931cb3c3ecfdab86f06b2055bde8316c3eaa330c" exitCode=0 Mar 08 03:37:47.782784 master-0 kubenswrapper[13046]: I0308 03:37:47.782692 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" event={"ID":"e26fb299-b4b3-4f84-acb9-82afd62a9c39","Type":"ContainerDied","Data":"18ca1a76436c4966da4a1dd5931cb3c3ecfdab86f06b2055bde8316c3eaa330c"} Mar 08 03:37:47.952195 master-0 kubenswrapper[13046]: I0308 03:37:47.952144 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:47.975082 master-0 kubenswrapper[13046]: I0308 03:37:47.975009 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2bt4j" podStartSLOduration=7.815813696 podStartE2EDuration="13.974990849s" podCreationTimestamp="2026-03-08 03:37:34 +0000 UTC" firstStartedPulling="2026-03-08 03:37:40.732818484 +0000 UTC m=+1462.811585701" lastFinishedPulling="2026-03-08 03:37:46.891995617 +0000 UTC m=+1468.970762854" observedRunningTime="2026-03-08 03:37:47.804771012 +0000 UTC m=+1469.883538229" watchObservedRunningTime="2026-03-08 03:37:47.974990849 +0000 UTC m=+1470.053758066" Mar 08 03:37:48.065891 master-0 kubenswrapper[13046]: I0308 03:37:48.065837 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-config\") pod \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " Mar 08 03:37:48.066601 master-0 kubenswrapper[13046]: I0308 03:37:48.066524 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqqt7\" (UniqueName: \"kubernetes.io/projected/e26fb299-b4b3-4f84-acb9-82afd62a9c39-kube-api-access-bqqt7\") pod \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " Mar 08 03:37:48.067066 master-0 kubenswrapper[13046]: I0308 03:37:48.066831 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-dns-svc\") pod \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " Mar 08 03:37:48.067310 master-0 kubenswrapper[13046]: I0308 03:37:48.067296 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-sb\") pod \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " Mar 08 03:37:48.067747 master-0 kubenswrapper[13046]: I0308 03:37:48.067734 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-nb\") pod \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\" (UID: \"e26fb299-b4b3-4f84-acb9-82afd62a9c39\") " Mar 08 03:37:48.070810 master-0 kubenswrapper[13046]: I0308 03:37:48.070462 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26fb299-b4b3-4f84-acb9-82afd62a9c39-kube-api-access-bqqt7" (OuterVolumeSpecName: "kube-api-access-bqqt7") pod "e26fb299-b4b3-4f84-acb9-82afd62a9c39" (UID: "e26fb299-b4b3-4f84-acb9-82afd62a9c39"). InnerVolumeSpecName "kube-api-access-bqqt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:48.109548 master-0 kubenswrapper[13046]: I0308 03:37:48.109499 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-config" (OuterVolumeSpecName: "config") pod "e26fb299-b4b3-4f84-acb9-82afd62a9c39" (UID: "e26fb299-b4b3-4f84-acb9-82afd62a9c39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:48.120218 master-0 kubenswrapper[13046]: I0308 03:37:48.117909 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e26fb299-b4b3-4f84-acb9-82afd62a9c39" (UID: "e26fb299-b4b3-4f84-acb9-82afd62a9c39"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:48.120218 master-0 kubenswrapper[13046]: I0308 03:37:48.120082 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e26fb299-b4b3-4f84-acb9-82afd62a9c39" (UID: "e26fb299-b4b3-4f84-acb9-82afd62a9c39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:48.123689 master-0 kubenswrapper[13046]: I0308 03:37:48.123660 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e26fb299-b4b3-4f84-acb9-82afd62a9c39" (UID: "e26fb299-b4b3-4f84-acb9-82afd62a9c39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:48.170034 master-0 kubenswrapper[13046]: I0308 03:37:48.169979 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:48.170137 master-0 kubenswrapper[13046]: I0308 03:37:48.170045 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:48.170137 master-0 kubenswrapper[13046]: I0308 03:37:48.170066 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:48.170137 master-0 kubenswrapper[13046]: I0308 03:37:48.170084 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e26fb299-b4b3-4f84-acb9-82afd62a9c39-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:48.170137 master-0 kubenswrapper[13046]: I0308 03:37:48.170104 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqqt7\" (UniqueName: \"kubernetes.io/projected/e26fb299-b4b3-4f84-acb9-82afd62a9c39-kube-api-access-bqqt7\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:48.796387 master-0 kubenswrapper[13046]: I0308 03:37:48.796325 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" event={"ID":"e26fb299-b4b3-4f84-acb9-82afd62a9c39","Type":"ContainerDied","Data":"b2569c51cfa7ef60b635cd8aec00a49b1f725894db0715619719c044df012241"} Mar 08 03:37:48.796387 master-0 kubenswrapper[13046]: I0308 03:37:48.796346 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-66sbz" Mar 08 03:37:48.796992 master-0 kubenswrapper[13046]: I0308 03:37:48.796391 13046 scope.go:117] "RemoveContainer" containerID="18ca1a76436c4966da4a1dd5931cb3c3ecfdab86f06b2055bde8316c3eaa330c" Mar 08 03:37:48.826985 master-0 kubenswrapper[13046]: I0308 03:37:48.826928 13046 scope.go:117] "RemoveContainer" containerID="35359c1f2956bed687b83a202dacc2edd597fb6dc3bdc1af75d3e9fcc13607ff" Mar 08 03:37:48.840578 master-0 kubenswrapper[13046]: I0308 03:37:48.840516 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-66sbz"] Mar 08 03:37:48.852747 master-0 kubenswrapper[13046]: I0308 03:37:48.852644 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-66sbz"] Mar 08 03:37:50.137742 master-0 kubenswrapper[13046]: I0308 03:37:50.137679 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" path="/var/lib/kubelet/pods/e26fb299-b4b3-4f84-acb9-82afd62a9c39/volumes" Mar 08 03:37:50.821680 master-0 kubenswrapper[13046]: 
I0308 03:37:50.821622 13046 generic.go:334] "Generic (PLEG): container finished" podID="1ace6ef2-a01f-4585-8282-c24e3d7a8246" containerID="0bc43cb2fb95a1524b067c67123e4f5b9af00b5c2abb9f0071b25bb073e375d3" exitCode=0 Mar 08 03:37:50.821680 master-0 kubenswrapper[13046]: I0308 03:37:50.821683 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vvfxv" event={"ID":"1ace6ef2-a01f-4585-8282-c24e3d7a8246","Type":"ContainerDied","Data":"0bc43cb2fb95a1524b067c67123e4f5b9af00b5c2abb9f0071b25bb073e375d3"} Mar 08 03:37:51.832754 master-0 kubenswrapper[13046]: I0308 03:37:51.832648 13046 generic.go:334] "Generic (PLEG): container finished" podID="7a1f8575-4138-44d5-9be6-14a70bf8170c" containerID="fb162992e6bc528bee5f9c0586b2d640cbe00d84c7c185f91dffcdf27cbe0181" exitCode=0 Mar 08 03:37:51.832754 master-0 kubenswrapper[13046]: I0308 03:37:51.832738 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bt4j" event={"ID":"7a1f8575-4138-44d5-9be6-14a70bf8170c","Type":"ContainerDied","Data":"fb162992e6bc528bee5f9c0586b2d640cbe00d84c7c185f91dffcdf27cbe0181"} Mar 08 03:37:52.397841 master-0 kubenswrapper[13046]: I0308 03:37:52.397762 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vvfxv" Mar 08 03:37:52.467940 master-0 kubenswrapper[13046]: I0308 03:37:52.467877 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdjwk\" (UniqueName: \"kubernetes.io/projected/1ace6ef2-a01f-4585-8282-c24e3d7a8246-kube-api-access-xdjwk\") pod \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " Mar 08 03:37:52.468191 master-0 kubenswrapper[13046]: I0308 03:37:52.468029 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-db-sync-config-data\") pod \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " Mar 08 03:37:52.468391 master-0 kubenswrapper[13046]: I0308 03:37:52.468353 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-config-data\") pod \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " Mar 08 03:37:52.468604 master-0 kubenswrapper[13046]: I0308 03:37:52.468559 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-combined-ca-bundle\") pod \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\" (UID: \"1ace6ef2-a01f-4585-8282-c24e3d7a8246\") " Mar 08 03:37:52.472457 master-0 kubenswrapper[13046]: I0308 03:37:52.472377 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1ace6ef2-a01f-4585-8282-c24e3d7a8246" (UID: "1ace6ef2-a01f-4585-8282-c24e3d7a8246"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:37:52.473280 master-0 kubenswrapper[13046]: I0308 03:37:52.473205 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ace6ef2-a01f-4585-8282-c24e3d7a8246-kube-api-access-xdjwk" (OuterVolumeSpecName: "kube-api-access-xdjwk") pod "1ace6ef2-a01f-4585-8282-c24e3d7a8246" (UID: "1ace6ef2-a01f-4585-8282-c24e3d7a8246"). InnerVolumeSpecName "kube-api-access-xdjwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:52.496522 master-0 kubenswrapper[13046]: I0308 03:37:52.496452 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ace6ef2-a01f-4585-8282-c24e3d7a8246" (UID: "1ace6ef2-a01f-4585-8282-c24e3d7a8246"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:37:52.524717 master-0 kubenswrapper[13046]: I0308 03:37:52.524636 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-config-data" (OuterVolumeSpecName: "config-data") pod "1ace6ef2-a01f-4585-8282-c24e3d7a8246" (UID: "1ace6ef2-a01f-4585-8282-c24e3d7a8246"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:37:52.572011 master-0 kubenswrapper[13046]: I0308 03:37:52.571428 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:52.572011 master-0 kubenswrapper[13046]: I0308 03:37:52.571496 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:52.572011 master-0 kubenswrapper[13046]: I0308 03:37:52.571512 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdjwk\" (UniqueName: \"kubernetes.io/projected/1ace6ef2-a01f-4585-8282-c24e3d7a8246-kube-api-access-xdjwk\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:52.572011 master-0 kubenswrapper[13046]: I0308 03:37:52.571528 13046 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ace6ef2-a01f-4585-8282-c24e3d7a8246-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:52.845765 master-0 kubenswrapper[13046]: I0308 03:37:52.845694 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vvfxv" event={"ID":"1ace6ef2-a01f-4585-8282-c24e3d7a8246","Type":"ContainerDied","Data":"ecda6fe9b18510252124d8d9977dbb2290bf57ff1dd53058c461d95692856a0d"} Mar 08 03:37:52.845765 master-0 kubenswrapper[13046]: I0308 03:37:52.845756 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecda6fe9b18510252124d8d9977dbb2290bf57ff1dd53058c461d95692856a0d" Mar 08 03:37:52.845765 master-0 kubenswrapper[13046]: I0308 03:37:52.845712 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vvfxv" Mar 08 03:37:53.356988 master-0 kubenswrapper[13046]: I0308 03:37:53.356896 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:53.499539 master-0 kubenswrapper[13046]: I0308 03:37:53.499389 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-config-data\") pod \"7a1f8575-4138-44d5-9be6-14a70bf8170c\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " Mar 08 03:37:53.499539 master-0 kubenswrapper[13046]: I0308 03:37:53.499523 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-combined-ca-bundle\") pod \"7a1f8575-4138-44d5-9be6-14a70bf8170c\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " Mar 08 03:37:53.499794 master-0 kubenswrapper[13046]: I0308 03:37:53.499610 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47wqh\" (UniqueName: \"kubernetes.io/projected/7a1f8575-4138-44d5-9be6-14a70bf8170c-kube-api-access-47wqh\") pod \"7a1f8575-4138-44d5-9be6-14a70bf8170c\" (UID: \"7a1f8575-4138-44d5-9be6-14a70bf8170c\") " Mar 08 03:37:53.506643 master-0 kubenswrapper[13046]: I0308 03:37:53.505129 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1f8575-4138-44d5-9be6-14a70bf8170c-kube-api-access-47wqh" (OuterVolumeSpecName: "kube-api-access-47wqh") pod "7a1f8575-4138-44d5-9be6-14a70bf8170c" (UID: "7a1f8575-4138-44d5-9be6-14a70bf8170c"). InnerVolumeSpecName "kube-api-access-47wqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:53.529607 master-0 kubenswrapper[13046]: I0308 03:37:53.527375 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66867b49c-pw8r4"] Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531039 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1f8575-4138-44d5-9be6-14a70bf8170c" containerName="keystone-db-sync" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531075 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1f8575-4138-44d5-9be6-14a70bf8170c" containerName="keystone-db-sync" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531096 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ace6ef2-a01f-4585-8282-c24e3d7a8246" containerName="glance-db-sync" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531106 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ace6ef2-a01f-4585-8282-c24e3d7a8246" containerName="glance-db-sync" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531116 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5eac99-34cd-4cdf-af11-d7475573518d" containerName="mariadb-database-create" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531122 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5eac99-34cd-4cdf-af11-d7475573518d" containerName="mariadb-database-create" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531147 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d02515-fb93-427a-9f9f-d97a1e68ec30" containerName="mariadb-account-create-update" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531154 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d02515-fb93-427a-9f9f-d97a1e68ec30" containerName="mariadb-account-create-update" Mar 08 03:37:53.537542 
master-0 kubenswrapper[13046]: E0308 03:37:53.531168 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerName="init" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531174 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerName="init" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531190 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerName="dnsmasq-dns" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531195 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerName="dnsmasq-dns" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531208 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68358212-00fe-4d13-9c37-70c62360ead0" containerName="ovn-config" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531214 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="68358212-00fe-4d13-9c37-70c62360ead0" containerName="ovn-config" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531232 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7f97b4-52cc-4108-95c4-dc762cd1398a" containerName="mariadb-database-create" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531238 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7f97b4-52cc-4108-95c4-dc762cd1398a" containerName="mariadb-database-create" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: E0308 03:37:53.531260 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26dfae9-3c54-4102-8762-903c01f9eb23" containerName="mariadb-account-create-update" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531266 13046 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d26dfae9-3c54-4102-8762-903c01f9eb23" containerName="mariadb-account-create-update" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531454 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7f97b4-52cc-4108-95c4-dc762cd1398a" containerName="mariadb-database-create" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531465 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26fb299-b4b3-4f84-acb9-82afd62a9c39" containerName="dnsmasq-dns" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.531491 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="68358212-00fe-4d13-9c37-70c62360ead0" containerName="ovn-config" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.535519 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1f8575-4138-44d5-9be6-14a70bf8170c" containerName="keystone-db-sync" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.535550 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26dfae9-3c54-4102-8762-903c01f9eb23" containerName="mariadb-account-create-update" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.535574 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ace6ef2-a01f-4585-8282-c24e3d7a8246" containerName="glance-db-sync" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.535607 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d02515-fb93-427a-9f9f-d97a1e68ec30" containerName="mariadb-account-create-update" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.535622 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5eac99-34cd-4cdf-af11-d7475573518d" containerName="mariadb-database-create" Mar 08 03:37:53.537542 master-0 kubenswrapper[13046]: I0308 03:37:53.537141 13046 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.567536 master-0 kubenswrapper[13046]: I0308 03:37:53.564272 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66867b49c-pw8r4"] Mar 08 03:37:53.603512 master-0 kubenswrapper[13046]: I0308 03:37:53.601384 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-svc\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.603512 master-0 kubenswrapper[13046]: I0308 03:37:53.601452 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-swift-storage-0\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.603512 master-0 kubenswrapper[13046]: I0308 03:37:53.601553 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-config\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.603512 master-0 kubenswrapper[13046]: I0308 03:37:53.601690 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-sb\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.603512 master-0 kubenswrapper[13046]: I0308 03:37:53.601856 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pflcs\" (UniqueName: \"kubernetes.io/projected/64152fe8-511f-4cdf-ab75-923920cf3c56-kube-api-access-pflcs\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.603512 master-0 kubenswrapper[13046]: I0308 03:37:53.602036 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-nb\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.603512 master-0 kubenswrapper[13046]: I0308 03:37:53.602118 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47wqh\" (UniqueName: \"kubernetes.io/projected/7a1f8575-4138-44d5-9be6-14a70bf8170c-kube-api-access-47wqh\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:53.622161 master-0 kubenswrapper[13046]: I0308 03:37:53.622104 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a1f8575-4138-44d5-9be6-14a70bf8170c" (UID: "7a1f8575-4138-44d5-9be6-14a70bf8170c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:37:53.631618 master-0 kubenswrapper[13046]: I0308 03:37:53.631571 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-config-data" (OuterVolumeSpecName: "config-data") pod "7a1f8575-4138-44d5-9be6-14a70bf8170c" (UID: "7a1f8575-4138-44d5-9be6-14a70bf8170c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:37:53.704340 master-0 kubenswrapper[13046]: I0308 03:37:53.704262 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-sb\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.704534 master-0 kubenswrapper[13046]: I0308 03:37:53.704391 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pflcs\" (UniqueName: \"kubernetes.io/projected/64152fe8-511f-4cdf-ab75-923920cf3c56-kube-api-access-pflcs\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.704534 master-0 kubenswrapper[13046]: I0308 03:37:53.704508 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-nb\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.704619 master-0 kubenswrapper[13046]: I0308 03:37:53.704564 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-svc\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.704652 master-0 kubenswrapper[13046]: I0308 03:37:53.704615 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-swift-storage-0\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: 
\"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.704691 master-0 kubenswrapper[13046]: I0308 03:37:53.704647 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-config\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.704764 master-0 kubenswrapper[13046]: I0308 03:37:53.704739 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:53.704809 master-0 kubenswrapper[13046]: I0308 03:37:53.704765 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1f8575-4138-44d5-9be6-14a70bf8170c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:53.705878 master-0 kubenswrapper[13046]: I0308 03:37:53.705819 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-sb\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.706210 master-0 kubenswrapper[13046]: I0308 03:37:53.706181 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-svc\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.706337 master-0 kubenswrapper[13046]: I0308 03:37:53.706300 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-swift-storage-0\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.706394 master-0 kubenswrapper[13046]: I0308 03:37:53.706307 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-config\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.706826 master-0 kubenswrapper[13046]: I0308 03:37:53.706805 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-nb\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.724065 master-0 kubenswrapper[13046]: I0308 03:37:53.724020 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pflcs\" (UniqueName: \"kubernetes.io/projected/64152fe8-511f-4cdf-ab75-923920cf3c56-kube-api-access-pflcs\") pod \"dnsmasq-dns-66867b49c-pw8r4\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") " pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:53.855939 master-0 kubenswrapper[13046]: I0308 03:37:53.855897 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2bt4j" event={"ID":"7a1f8575-4138-44d5-9be6-14a70bf8170c","Type":"ContainerDied","Data":"25336bb7082dd575cf7937c1ddae6aece2a12b57df9b42b5156b43b4e9882377"} Mar 08 03:37:53.856452 master-0 kubenswrapper[13046]: I0308 03:37:53.856436 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25336bb7082dd575cf7937c1ddae6aece2a12b57df9b42b5156b43b4e9882377" Mar 08 03:37:53.856633 master-0 
kubenswrapper[13046]: I0308 03:37:53.855945 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2bt4j" Mar 08 03:37:53.978429 master-0 kubenswrapper[13046]: I0308 03:37:53.978337 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66867b49c-pw8r4" Mar 08 03:37:54.277503 master-0 kubenswrapper[13046]: I0308 03:37:54.276880 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66867b49c-pw8r4"] Mar 08 03:37:54.302137 master-0 kubenswrapper[13046]: I0308 03:37:54.302097 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qv4vs"] Mar 08 03:37:54.304526 master-0 kubenswrapper[13046]: I0308 03:37:54.303947 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.313932 master-0 kubenswrapper[13046]: I0308 03:37:54.307517 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 03:37:54.313932 master-0 kubenswrapper[13046]: I0308 03:37:54.307698 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 03:37:54.313932 master-0 kubenswrapper[13046]: I0308 03:37:54.307825 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 03:37:54.313932 master-0 kubenswrapper[13046]: I0308 03:37:54.307922 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 03:37:54.313932 master-0 kubenswrapper[13046]: I0308 03:37:54.308626 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qv4vs"] Mar 08 03:37:54.379997 master-0 kubenswrapper[13046]: I0308 03:37:54.379911 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6c9bdf-4zqjl"] Mar 08 03:37:54.382525 master-0 kubenswrapper[13046]: 
I0308 03:37:54.381762 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.417809 master-0 kubenswrapper[13046]: I0308 03:37:54.415661 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6c9bdf-4zqjl"] Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445012 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjzl\" (UniqueName: \"kubernetes.io/projected/027393bc-e01e-4cb9-ade9-e9512d032984-kube-api-access-wcjzl\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445064 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445119 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445141 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmw9j\" (UniqueName: \"kubernetes.io/projected/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-kube-api-access-gmw9j\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " 
pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445173 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445209 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-fernet-keys\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445225 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-credential-keys\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445247 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-svc\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445281 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-scripts\") pod \"keystone-bootstrap-qv4vs\" (UID: 
\"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445304 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-combined-ca-bundle\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445321 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-config-data\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.449444 master-0 kubenswrapper[13046]: I0308 03:37:54.445340 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.555911 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-fernet-keys\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.555951 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-credential-keys\") pod 
\"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.555980 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-svc\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556015 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-scripts\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556045 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-combined-ca-bundle\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556061 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-config-data\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556081 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: 
\"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556115 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjzl\" (UniqueName: \"kubernetes.io/projected/027393bc-e01e-4cb9-ade9-e9512d032984-kube-api-access-wcjzl\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556142 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556216 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556455 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmw9j\" (UniqueName: \"kubernetes.io/projected/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-kube-api-access-gmw9j\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.557426 master-0 kubenswrapper[13046]: I0308 03:37:54.556515 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-nb\") pod 
\"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.557940 master-0 kubenswrapper[13046]: I0308 03:37:54.557648 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.574184 master-0 kubenswrapper[13046]: I0308 03:37:54.563924 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.574184 master-0 kubenswrapper[13046]: I0308 03:37:54.567107 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.574184 master-0 kubenswrapper[13046]: I0308 03:37:54.568247 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-svc\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.577877 master-0 kubenswrapper[13046]: W0308 03:37:54.577808 13046 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64152fe8_511f_4cdf_ab75_923920cf3c56.slice/crio-6a8afa6193a78f534a4e9242734c328531f3c8293f3ffb9b2e87cb21eba744ad WatchSource:0}: Error finding container 6a8afa6193a78f534a4e9242734c328531f3c8293f3ffb9b2e87cb21eba744ad: Status 404 returned error can't find the container with id 6a8afa6193a78f534a4e9242734c328531f3c8293f3ffb9b2e87cb21eba744ad Mar 08 03:37:54.578693 master-0 kubenswrapper[13046]: I0308 03:37:54.578448 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.582611 master-0 kubenswrapper[13046]: I0308 03:37:54.581406 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-scripts\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.582985 master-0 kubenswrapper[13046]: I0308 03:37:54.582630 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-combined-ca-bundle\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.593190 master-0 kubenswrapper[13046]: I0308 03:37:54.592830 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-config-data\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.595246 master-0 kubenswrapper[13046]: 
I0308 03:37:54.595201 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-b2m2r"] Mar 08 03:37:54.602343 master-0 kubenswrapper[13046]: I0308 03:37:54.599618 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-fernet-keys\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.604015 master-0 kubenswrapper[13046]: I0308 03:37:54.603976 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-credential-keys\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.610241 master-0 kubenswrapper[13046]: I0308 03:37:54.610184 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:54.630791 master-0 kubenswrapper[13046]: I0308 03:37:54.630742 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-b2m2r"] Mar 08 03:37:54.643857 master-0 kubenswrapper[13046]: I0308 03:37:54.642505 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmw9j\" (UniqueName: \"kubernetes.io/projected/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-kube-api-access-gmw9j\") pod \"keystone-bootstrap-qv4vs\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.661554 master-0 kubenswrapper[13046]: I0308 03:37:54.660908 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjzl\" (UniqueName: \"kubernetes.io/projected/027393bc-e01e-4cb9-ade9-e9512d032984-kube-api-access-wcjzl\") pod \"dnsmasq-dns-74f6c9bdf-4zqjl\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") " pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.704037 master-0 kubenswrapper[13046]: I0308 03:37:54.703920 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-db-sync-xz5js"] Mar 08 03:37:54.707023 master-0 kubenswrapper[13046]: I0308 03:37:54.705233 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.726503 master-0 kubenswrapper[13046]: I0308 03:37:54.724069 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-scripts" Mar 08 03:37:54.726503 master-0 kubenswrapper[13046]: I0308 03:37:54.724292 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-config-data" Mar 08 03:37:54.738733 master-0 kubenswrapper[13046]: I0308 03:37:54.738701 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-db-sync-xz5js"] Mar 08 03:37:54.745285 master-0 kubenswrapper[13046]: I0308 03:37:54.744858 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.760815 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-scripts\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.760911 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-etc-machine-id\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.760942 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txz2\" (UniqueName: \"kubernetes.io/projected/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-kube-api-access-5txz2\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: 
\"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.760963 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-config-data\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.760997 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5ggm\" (UniqueName: \"kubernetes.io/projected/3cf185b5-9d0e-48e6-a961-fba08aa4688a-kube-api-access-r5ggm\") pod \"ironic-db-create-b2m2r\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.761037 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-db-sync-config-data\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.761057 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-combined-ca-bundle\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.761577 master-0 kubenswrapper[13046]: I0308 03:37:54.761079 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3cf185b5-9d0e-48e6-a961-fba08aa4688a-operator-scripts\") pod \"ironic-db-create-b2m2r\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:54.771628 master-0 kubenswrapper[13046]: I0308 03:37:54.769108 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66867b49c-pw8r4"] Mar 08 03:37:54.823345 master-0 kubenswrapper[13046]: I0308 03:37:54.821374 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-k4mpt"] Mar 08 03:37:54.823345 master-0 kubenswrapper[13046]: I0308 03:37:54.823208 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:54.826590 master-0 kubenswrapper[13046]: I0308 03:37:54.824637 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" Mar 08 03:37:54.832565 master-0 kubenswrapper[13046]: I0308 03:37:54.827688 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 03:37:54.832565 master-0 kubenswrapper[13046]: I0308 03:37:54.827920 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865039 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-db-sync-config-data\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865097 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-combined-ca-bundle\") pod 
\"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865131 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cf185b5-9d0e-48e6-a961-fba08aa4688a-operator-scripts\") pod \"ironic-db-create-b2m2r\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865216 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-scripts\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865278 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-etc-machine-id\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865308 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5txz2\" (UniqueName: \"kubernetes.io/projected/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-kube-api-access-5txz2\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865332 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-config-data\") pod 
\"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.865494 master-0 kubenswrapper[13046]: I0308 03:37:54.865367 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ggm\" (UniqueName: \"kubernetes.io/projected/3cf185b5-9d0e-48e6-a961-fba08aa4688a-kube-api-access-r5ggm\") pod \"ironic-db-create-b2m2r\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:54.869275 master-0 kubenswrapper[13046]: I0308 03:37:54.867618 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-etc-machine-id\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.875144 master-0 kubenswrapper[13046]: I0308 03:37:54.874796 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cf185b5-9d0e-48e6-a961-fba08aa4688a-operator-scripts\") pod \"ironic-db-create-b2m2r\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:54.879773 master-0 kubenswrapper[13046]: I0308 03:37:54.879018 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-db-sync-config-data\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.884516 master-0 kubenswrapper[13046]: I0308 03:37:54.881964 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-7752-account-create-update-qwq4w"] Mar 08 03:37:54.884516 master-0 kubenswrapper[13046]: I0308 03:37:54.881979 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-scripts\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.887813 master-0 kubenswrapper[13046]: I0308 03:37:54.886563 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:54.887813 master-0 kubenswrapper[13046]: I0308 03:37:54.886763 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-config-data\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.891661 master-0 kubenswrapper[13046]: I0308 03:37:54.888657 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Mar 08 03:37:54.897093 master-0 kubenswrapper[13046]: I0308 03:37:54.897051 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5ggm\" (UniqueName: \"kubernetes.io/projected/3cf185b5-9d0e-48e6-a961-fba08aa4688a-kube-api-access-r5ggm\") pod \"ironic-db-create-b2m2r\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:54.907264 master-0 kubenswrapper[13046]: I0308 03:37:54.902438 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66867b49c-pw8r4" event={"ID":"64152fe8-511f-4cdf-ab75-923920cf3c56","Type":"ContainerStarted","Data":"6a8afa6193a78f534a4e9242734c328531f3c8293f3ffb9b2e87cb21eba744ad"} Mar 08 03:37:54.912886 master-0 kubenswrapper[13046]: I0308 03:37:54.912297 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txz2\" (UniqueName: 
\"kubernetes.io/projected/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-kube-api-access-5txz2\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.919702 master-0 kubenswrapper[13046]: I0308 03:37:54.919664 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-combined-ca-bundle\") pod \"cinder-e64dd-db-sync-xz5js\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:54.975998 master-0 kubenswrapper[13046]: I0308 03:37:54.975953 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-combined-ca-bundle\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:54.976559 master-0 kubenswrapper[13046]: I0308 03:37:54.976248 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-config\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:54.976559 master-0 kubenswrapper[13046]: I0308 03:37:54.976396 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgwdj\" (UniqueName: \"kubernetes.io/projected/f6b0006e-a04e-4dcb-a516-b6d02385a494-kube-api-access-dgwdj\") pod \"ironic-7752-account-create-update-qwq4w\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:54.976559 master-0 kubenswrapper[13046]: I0308 03:37:54.976429 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b0006e-a04e-4dcb-a516-b6d02385a494-operator-scripts\") pod \"ironic-7752-account-create-update-qwq4w\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:54.976863 master-0 kubenswrapper[13046]: I0308 03:37:54.976794 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnr6b\" (UniqueName: \"kubernetes.io/projected/9fa4b3ae-ec1e-4819-a483-12a563171db2-kube-api-access-nnr6b\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:54.990671 master-0 kubenswrapper[13046]: I0308 03:37:54.990624 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k4mpt"] Mar 08 03:37:55.021402 master-0 kubenswrapper[13046]: I0308 03:37:55.018141 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7752-account-create-update-qwq4w"] Mar 08 03:37:55.038032 master-0 kubenswrapper[13046]: I0308 03:37:55.037472 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6c9bdf-4zqjl"] Mar 08 03:37:55.047426 master-0 kubenswrapper[13046]: I0308 03:37:55.047375 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dzj84"] Mar 08 03:37:55.051248 master-0 kubenswrapper[13046]: I0308 03:37:55.051211 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.053921 master-0 kubenswrapper[13046]: I0308 03:37:55.053833 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 03:37:55.054490 master-0 kubenswrapper[13046]: I0308 03:37:55.054424 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 03:37:55.055258 master-0 kubenswrapper[13046]: I0308 03:37:55.055226 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:55.058013 master-0 kubenswrapper[13046]: I0308 03:37:55.056570 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dzj84"] Mar 08 03:37:55.081662 master-0 kubenswrapper[13046]: I0308 03:37:55.078426 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-combined-ca-bundle\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:55.081662 master-0 kubenswrapper[13046]: I0308 03:37:55.078592 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-config\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:55.081662 master-0 kubenswrapper[13046]: I0308 03:37:55.078656 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgwdj\" (UniqueName: \"kubernetes.io/projected/f6b0006e-a04e-4dcb-a516-b6d02385a494-kube-api-access-dgwdj\") pod \"ironic-7752-account-create-update-qwq4w\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 
03:37:55.081662 master-0 kubenswrapper[13046]: I0308 03:37:55.078685 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b0006e-a04e-4dcb-a516-b6d02385a494-operator-scripts\") pod \"ironic-7752-account-create-update-qwq4w\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:55.081662 master-0 kubenswrapper[13046]: I0308 03:37:55.078753 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnr6b\" (UniqueName: \"kubernetes.io/projected/9fa4b3ae-ec1e-4819-a483-12a563171db2-kube-api-access-nnr6b\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:55.090444 master-0 kubenswrapper[13046]: I0308 03:37:55.086696 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:37:55.090444 master-0 kubenswrapper[13046]: I0308 03:37:55.086840 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-combined-ca-bundle\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:55.098679 master-0 kubenswrapper[13046]: I0308 03:37:55.095261 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b0006e-a04e-4dcb-a516-b6d02385a494-operator-scripts\") pod \"ironic-7752-account-create-update-qwq4w\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:55.103755 master-0 kubenswrapper[13046]: I0308 03:37:55.102116 13046 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-68fb5c97f-jv5vb"] Mar 08 03:37:55.109144 master-0 kubenswrapper[13046]: I0308 03:37:55.109114 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.109287 master-0 kubenswrapper[13046]: I0308 03:37:55.109197 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-config\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:55.109287 master-0 kubenswrapper[13046]: I0308 03:37:55.109109 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnr6b\" (UniqueName: \"kubernetes.io/projected/9fa4b3ae-ec1e-4819-a483-12a563171db2-kube-api-access-nnr6b\") pod \"neutron-db-sync-k4mpt\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:55.119630 master-0 kubenswrapper[13046]: I0308 03:37:55.119191 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68fb5c97f-jv5vb"] Mar 08 03:37:55.129643 master-0 kubenswrapper[13046]: I0308 03:37:55.122857 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgwdj\" (UniqueName: \"kubernetes.io/projected/f6b0006e-a04e-4dcb-a516-b6d02385a494-kube-api-access-dgwdj\") pod \"ironic-7752-account-create-update-qwq4w\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.187786 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188133 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-config-data\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188219 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-combined-ca-bundle\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188263 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtdnx\" (UniqueName: \"kubernetes.io/projected/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-kube-api-access-qtdnx\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188319 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f71b2d7-115f-473a-9427-8af24a1a7467-logs\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188340 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-sb\") 
pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188408 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-scripts\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188494 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvcpw\" (UniqueName: \"kubernetes.io/projected/2f71b2d7-115f-473a-9427-8af24a1a7467-kube-api-access-bvcpw\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188582 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-nb\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188645 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-swift-storage-0\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.188665 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-svc\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.191135 master-0 kubenswrapper[13046]: I0308 03:37:55.190764 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-config\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.225549 master-0 kubenswrapper[13046]: I0308 03:37:55.224035 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:55.292425 master-0 kubenswrapper[13046]: I0308 03:37:55.292396 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-config-data\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.292564 master-0 kubenswrapper[13046]: I0308 03:37:55.292451 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-combined-ca-bundle\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.292564 master-0 kubenswrapper[13046]: I0308 03:37:55.292526 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtdnx\" (UniqueName: \"kubernetes.io/projected/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-kube-api-access-qtdnx\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " 
pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.292639 master-0 kubenswrapper[13046]: I0308 03:37:55.292565 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f71b2d7-115f-473a-9427-8af24a1a7467-logs\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.292639 master-0 kubenswrapper[13046]: I0308 03:37:55.292583 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-sb\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.292639 master-0 kubenswrapper[13046]: I0308 03:37:55.292617 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-scripts\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.292722 master-0 kubenswrapper[13046]: I0308 03:37:55.292653 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvcpw\" (UniqueName: \"kubernetes.io/projected/2f71b2d7-115f-473a-9427-8af24a1a7467-kube-api-access-bvcpw\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.292722 master-0 kubenswrapper[13046]: I0308 03:37:55.292696 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-nb\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " 
pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.292778 master-0 kubenswrapper[13046]: I0308 03:37:55.292735 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-swift-storage-0\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.293629 master-0 kubenswrapper[13046]: I0308 03:37:55.293609 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-sb\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.294064 master-0 kubenswrapper[13046]: I0308 03:37:55.293976 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-svc\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.294980 master-0 kubenswrapper[13046]: I0308 03:37:55.294947 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-nb\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.295034 master-0 kubenswrapper[13046]: I0308 03:37:55.295008 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-config\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " 
pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.295519 master-0 kubenswrapper[13046]: I0308 03:37:55.295426 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f71b2d7-115f-473a-9427-8af24a1a7467-logs\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.296430 master-0 kubenswrapper[13046]: I0308 03:37:55.296261 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-swift-storage-0\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.296505 master-0 kubenswrapper[13046]: I0308 03:37:55.296426 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-config\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.298303 master-0 kubenswrapper[13046]: I0308 03:37:55.298273 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-svc\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.301792 master-0 kubenswrapper[13046]: I0308 03:37:55.301130 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-scripts\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.305268 master-0 kubenswrapper[13046]: I0308 
03:37:55.305234 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-combined-ca-bundle\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.306465 master-0 kubenswrapper[13046]: I0308 03:37:55.306304 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-config-data\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.310061 master-0 kubenswrapper[13046]: I0308 03:37:55.310031 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtdnx\" (UniqueName: \"kubernetes.io/projected/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-kube-api-access-qtdnx\") pod \"dnsmasq-dns-68fb5c97f-jv5vb\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.316370 master-0 kubenswrapper[13046]: I0308 03:37:55.316263 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvcpw\" (UniqueName: \"kubernetes.io/projected/2f71b2d7-115f-473a-9427-8af24a1a7467-kube-api-access-bvcpw\") pod \"placement-db-sync-dzj84\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.342156 master-0 kubenswrapper[13046]: I0308 03:37:55.338858 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qv4vs"] Mar 08 03:37:55.411043 master-0 kubenswrapper[13046]: I0308 03:37:55.410655 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzj84" Mar 08 03:37:55.450910 master-0 kubenswrapper[13046]: I0308 03:37:55.448887 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:55.605365 master-0 kubenswrapper[13046]: I0308 03:37:55.605309 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6c9bdf-4zqjl"] Mar 08 03:37:55.637407 master-0 kubenswrapper[13046]: W0308 03:37:55.636654 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027393bc_e01e_4cb9_ade9_e9512d032984.slice/crio-a8417ef880899f1be4ae4b8369156b3dfdd6d533475cce721d882143b6b36f2c WatchSource:0}: Error finding container a8417ef880899f1be4ae4b8369156b3dfdd6d533475cce721d882143b6b36f2c: Status 404 returned error can't find the container with id a8417ef880899f1be4ae4b8369156b3dfdd6d533475cce721d882143b6b36f2c Mar 08 03:37:55.805943 master-0 kubenswrapper[13046]: I0308 03:37:55.805394 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-b2m2r"] Mar 08 03:37:55.924949 master-0 kubenswrapper[13046]: I0308 03:37:55.924897 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" event={"ID":"027393bc-e01e-4cb9-ade9-e9512d032984","Type":"ContainerStarted","Data":"a8417ef880899f1be4ae4b8369156b3dfdd6d533475cce721d882143b6b36f2c"} Mar 08 03:37:55.926225 master-0 kubenswrapper[13046]: I0308 03:37:55.926133 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qv4vs" event={"ID":"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe","Type":"ContainerStarted","Data":"f35a05d83d76f6e69f3fa02ef2bbac803a8038ac5e36963b0ab1cbd58658e0e5"} Mar 08 03:37:55.926963 master-0 kubenswrapper[13046]: I0308 03:37:55.926935 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-b2m2r" 
event={"ID":"3cf185b5-9d0e-48e6-a961-fba08aa4688a","Type":"ContainerStarted","Data":"49cf933ce61c0a0588fe38697c4b0bb47975ac2902f699da3b501c2748172e60"} Mar 08 03:37:55.928905 master-0 kubenswrapper[13046]: I0308 03:37:55.928846 13046 generic.go:334] "Generic (PLEG): container finished" podID="64152fe8-511f-4cdf-ab75-923920cf3c56" containerID="2047bc713c2a4c576e18ee69df3fbd48c10551655a7d4bd93f9e966d5b908dcb" exitCode=0 Mar 08 03:37:55.929566 master-0 kubenswrapper[13046]: I0308 03:37:55.929278 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66867b49c-pw8r4" event={"ID":"64152fe8-511f-4cdf-ab75-923920cf3c56","Type":"ContainerDied","Data":"2047bc713c2a4c576e18ee69df3fbd48c10551655a7d4bd93f9e966d5b908dcb"} Mar 08 03:37:55.964114 master-0 kubenswrapper[13046]: I0308 03:37:55.964036 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k4mpt"] Mar 08 03:37:56.325614 master-0 kubenswrapper[13046]: I0308 03:37:56.325576 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-db-sync-xz5js"] Mar 08 03:37:56.338372 master-0 kubenswrapper[13046]: I0308 03:37:56.338317 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7752-account-create-update-qwq4w"] Mar 08 03:37:56.346388 master-0 kubenswrapper[13046]: W0308 03:37:56.346356 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod888c100e_3bc9_45fa_a5a2_fe687ee09c1c.slice/crio-e89648db380f8c90eca179e4d2d28ed45f381d183bb8f4e4c1fa97643f7943e0 WatchSource:0}: Error finding container e89648db380f8c90eca179e4d2d28ed45f381d183bb8f4e4c1fa97643f7943e0: Status 404 returned error can't find the container with id e89648db380f8c90eca179e4d2d28ed45f381d183bb8f4e4c1fa97643f7943e0 Mar 08 03:37:56.463128 master-0 kubenswrapper[13046]: I0308 03:37:56.463067 13046 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:37:56.467455 master-0 kubenswrapper[13046]: I0308 03:37:56.467414 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.470936 master-0 kubenswrapper[13046]: I0308 03:37:56.470906 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 03:37:56.472879 master-0 kubenswrapper[13046]: I0308 03:37:56.472805 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-default-external-config-data" Mar 08 03:37:56.539617 master-0 kubenswrapper[13046]: I0308 03:37:56.523955 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:37:56.654503 master-0 kubenswrapper[13046]: I0308 03:37:56.647970 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.654503 master-0 kubenswrapper[13046]: I0308 03:37:56.648027 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.654503 master-0 kubenswrapper[13046]: I0308 03:37:56.648056 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: 
\"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.654503 master-0 kubenswrapper[13046]: I0308 03:37:56.648090 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.654503 master-0 kubenswrapper[13046]: I0308 03:37:56.648112 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.654503 master-0 kubenswrapper[13046]: I0308 03:37:56.648146 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.654503 master-0 kubenswrapper[13046]: I0308 03:37:56.648163 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7prf\" (UniqueName: \"kubernetes.io/projected/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-kube-api-access-w7prf\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.687021 master-0 kubenswrapper[13046]: I0308 03:37:56.686965 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-dzj84"] Mar 08 03:37:56.718556 master-0 kubenswrapper[13046]: I0308 03:37:56.718511 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68fb5c97f-jv5vb"] Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.766119 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.766175 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.766201 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.766231 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.766250 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.766283 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.766304 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7prf\" (UniqueName: \"kubernetes.io/projected/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-kube-api-access-w7prf\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.767083 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.769109 master-0 kubenswrapper[13046]: I0308 03:37:56.767294 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.792560 master-0 kubenswrapper[13046]: I0308 03:37:56.789972 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.792560 master-0 kubenswrapper[13046]: I0308 03:37:56.791798 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 03:37:56.792560 master-0 kubenswrapper[13046]: I0308 03:37:56.791840 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7ee56d67d41b92c054773f64bf7346771894ec4c4c18aa0117ea2564b4c6d4a8/globalmount\"" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:56.797436 master-0 kubenswrapper[13046]: I0308 03:37:56.797227 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66867b49c-pw8r4"
Mar 08 03:37:56.801224 master-0 kubenswrapper[13046]: I0308 03:37:56.801183 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:37:56.803456 master-0 kubenswrapper[13046]: I0308 03:37:56.803413 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7prf\" (UniqueName: \"kubernetes.io/projected/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-kube-api-access-w7prf\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:37:56.814609 master-0 kubenswrapper[13046]: I0308 03:37:56.814063 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:37:56.953301 master-0 kubenswrapper[13046]: I0308 03:37:56.953239 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k4mpt" event={"ID":"9fa4b3ae-ec1e-4819-a483-12a563171db2","Type":"ContainerStarted","Data":"4d71bda5b031b2d9a24e5b6a59a0fc20622fc76fecb76d2cacb2685b9dfd5b5b"}
Mar 08 03:37:56.953301 master-0 kubenswrapper[13046]: I0308 03:37:56.953294 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k4mpt" event={"ID":"9fa4b3ae-ec1e-4819-a483-12a563171db2","Type":"ContainerStarted","Data":"92492e587a4284a9aea4e12634213fc0832fdd3922361c4957c2868169c96b08"}
Mar 08 03:37:56.954934 master-0 kubenswrapper[13046]: I0308 03:37:56.954617 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7752-account-create-update-qwq4w" event={"ID":"f6b0006e-a04e-4dcb-a516-b6d02385a494","Type":"ContainerStarted","Data":"a5ccc39beb1396e690572ed31c100f656d40814a7144ef2842acc64325078985"}
Mar 08 03:37:56.956275 master-0 kubenswrapper[13046]: I0308 03:37:56.956234 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzj84" event={"ID":"2f71b2d7-115f-473a-9427-8af24a1a7467","Type":"ContainerStarted","Data":"ba6576b1c78c76016bf5c2d6723276cd4aef7235fbe486054e12e0390c9b0b66"}
Mar 08 03:37:56.957718 master-0 kubenswrapper[13046]: I0308 03:37:56.957668 13046 generic.go:334] "Generic (PLEG): container finished" podID="027393bc-e01e-4cb9-ade9-e9512d032984" containerID="ec4b5090fcc9e884948731710e0636a0d85da8d133697044f085e8882247bde5" exitCode=0
Mar 08 03:37:56.957718 master-0 kubenswrapper[13046]: I0308 03:37:56.957686 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" event={"ID":"027393bc-e01e-4cb9-ade9-e9512d032984","Type":"ContainerDied","Data":"ec4b5090fcc9e884948731710e0636a0d85da8d133697044f085e8882247bde5"}
Mar 08 03:37:56.961962 master-0 kubenswrapper[13046]: I0308 03:37:56.961928 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" event={"ID":"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37","Type":"ContainerStarted","Data":"41c14e926ac9ee355bba3f68f69c7c33c04730468b1ab386f389d53703c637a7"}
Mar 08 03:37:56.964950 master-0 kubenswrapper[13046]: I0308 03:37:56.964925 13046 generic.go:334] "Generic (PLEG): container finished" podID="3cf185b5-9d0e-48e6-a961-fba08aa4688a" containerID="a5b5a62d52c2d150f9d40c3c5dcbd8b35145551c7dc57bbf98937aa98ee8cd13" exitCode=0
Mar 08 03:37:56.965025 master-0 kubenswrapper[13046]: I0308 03:37:56.964977 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-b2m2r" event={"ID":"3cf185b5-9d0e-48e6-a961-fba08aa4688a","Type":"ContainerDied","Data":"a5b5a62d52c2d150f9d40c3c5dcbd8b35145551c7dc57bbf98937aa98ee8cd13"}
Mar 08 03:37:56.967330 master-0 kubenswrapper[13046]: I0308 03:37:56.967255 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-db-sync-xz5js" event={"ID":"888c100e-3bc9-45fa-a5a2-fe687ee09c1c","Type":"ContainerStarted","Data":"e89648db380f8c90eca179e4d2d28ed45f381d183bb8f4e4c1fa97643f7943e0"}
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.977031 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66867b49c-pw8r4" event={"ID":"64152fe8-511f-4cdf-ab75-923920cf3c56","Type":"ContainerDied","Data":"6a8afa6193a78f534a4e9242734c328531f3c8293f3ffb9b2e87cb21eba744ad"}
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.977089 13046 scope.go:117] "RemoveContainer" containerID="2047bc713c2a4c576e18ee69df3fbd48c10551655a7d4bd93f9e966d5b908dcb"
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.977203 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66867b49c-pw8r4"
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.982244 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-sb\") pod \"64152fe8-511f-4cdf-ab75-923920cf3c56\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") "
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.982328 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-swift-storage-0\") pod \"64152fe8-511f-4cdf-ab75-923920cf3c56\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") "
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.982401 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pflcs\" (UniqueName: \"kubernetes.io/projected/64152fe8-511f-4cdf-ab75-923920cf3c56-kube-api-access-pflcs\") pod \"64152fe8-511f-4cdf-ab75-923920cf3c56\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") "
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.982427 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-nb\") pod \"64152fe8-511f-4cdf-ab75-923920cf3c56\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") "
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.982496 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-config\") pod \"64152fe8-511f-4cdf-ab75-923920cf3c56\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") "
Mar 08 03:37:56.986044 master-0 kubenswrapper[13046]: I0308 03:37:56.982603 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-svc\") pod \"64152fe8-511f-4cdf-ab75-923920cf3c56\" (UID: \"64152fe8-511f-4cdf-ab75-923920cf3c56\") "
Mar 08 03:37:56.986736 master-0 kubenswrapper[13046]: I0308 03:37:56.986678 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qv4vs" event={"ID":"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe","Type":"ContainerStarted","Data":"b1d28aefb2d4d0f07983029a4b00c727be020bd6d0e72d488ab98cca65d32172"}
Mar 08 03:37:57.017105 master-0 kubenswrapper[13046]: I0308 03:37:57.016932 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64152fe8-511f-4cdf-ab75-923920cf3c56-kube-api-access-pflcs" (OuterVolumeSpecName: "kube-api-access-pflcs") pod "64152fe8-511f-4cdf-ab75-923920cf3c56" (UID: "64152fe8-511f-4cdf-ab75-923920cf3c56"). InnerVolumeSpecName "kube-api-access-pflcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:57.017503 master-0 kubenswrapper[13046]: I0308 03:37:57.017252 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64152fe8-511f-4cdf-ab75-923920cf3c56" (UID: "64152fe8-511f-4cdf-ab75-923920cf3c56"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.022800 master-0 kubenswrapper[13046]: I0308 03:37:57.022736 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-k4mpt" podStartSLOduration=3.022716197 podStartE2EDuration="3.022716197s" podCreationTimestamp="2026-03-08 03:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:56.99360373 +0000 UTC m=+1479.072370947" watchObservedRunningTime="2026-03-08 03:37:57.022716197 +0000 UTC m=+1479.101483414"
Mar 08 03:37:57.039520 master-0 kubenswrapper[13046]: I0308 03:37:57.039419 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64152fe8-511f-4cdf-ab75-923920cf3c56" (UID: "64152fe8-511f-4cdf-ab75-923920cf3c56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.049392 master-0 kubenswrapper[13046]: I0308 03:37:57.048831 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-config" (OuterVolumeSpecName: "config") pod "64152fe8-511f-4cdf-ab75-923920cf3c56" (UID: "64152fe8-511f-4cdf-ab75-923920cf3c56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.086090 master-0 kubenswrapper[13046]: I0308 03:37:57.086040 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.086090 master-0 kubenswrapper[13046]: I0308 03:37:57.086089 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.086266 master-0 kubenswrapper[13046]: I0308 03:37:57.086104 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pflcs\" (UniqueName: \"kubernetes.io/projected/64152fe8-511f-4cdf-ab75-923920cf3c56-kube-api-access-pflcs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.086266 master-0 kubenswrapper[13046]: I0308 03:37:57.086120 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.113219 master-0 kubenswrapper[13046]: I0308 03:37:57.113171 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "64152fe8-511f-4cdf-ab75-923920cf3c56" (UID: "64152fe8-511f-4cdf-ab75-923920cf3c56"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.131096 master-0 kubenswrapper[13046]: I0308 03:37:57.119029 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64152fe8-511f-4cdf-ab75-923920cf3c56" (UID: "64152fe8-511f-4cdf-ab75-923920cf3c56"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.172490 master-0 kubenswrapper[13046]: I0308 03:37:57.172394 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qv4vs" podStartSLOduration=3.17237168 podStartE2EDuration="3.17237168s" podCreationTimestamp="2026-03-08 03:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:57.158704741 +0000 UTC m=+1479.237471958" watchObservedRunningTime="2026-03-08 03:37:57.17237168 +0000 UTC m=+1479.251138897"
Mar 08 03:37:57.190733 master-0 kubenswrapper[13046]: I0308 03:37:57.190350 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.190733 master-0 kubenswrapper[13046]: I0308 03:37:57.190391 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64152fe8-511f-4cdf-ab75-923920cf3c56-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.269711 master-0 kubenswrapper[13046]: I0308 03:37:57.269226 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-external-api-0"]
Mar 08 03:37:57.289517 master-0 kubenswrapper[13046]: E0308 03:37:57.273540 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-bf784-default-external-api-0" podUID="64de8d37-91d1-4ee9-8067-4a9bac68aa8c"
Mar 08 03:37:57.410696 master-0 kubenswrapper[13046]: I0308 03:37:57.410599 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66867b49c-pw8r4"]
Mar 08 03:37:57.433496 master-0 kubenswrapper[13046]: I0308 03:37:57.433423 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66867b49c-pw8r4"]
Mar 08 03:37:57.446243 master-0 kubenswrapper[13046]: I0308 03:37:57.446151 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bf784-default-internal-api-0"]
Mar 08 03:37:57.446788 master-0 kubenswrapper[13046]: E0308 03:37:57.446768 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64152fe8-511f-4cdf-ab75-923920cf3c56" containerName="init"
Mar 08 03:37:57.446788 master-0 kubenswrapper[13046]: I0308 03:37:57.446788 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="64152fe8-511f-4cdf-ab75-923920cf3c56" containerName="init"
Mar 08 03:37:57.447037 master-0 kubenswrapper[13046]: I0308 03:37:57.447020 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="64152fe8-511f-4cdf-ab75-923920cf3c56" containerName="init"
Mar 08 03:37:57.448185 master-0 kubenswrapper[13046]: I0308 03:37:57.448109 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.453978 master-0 kubenswrapper[13046]: I0308 03:37:57.451831 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-default-internal-config-data"
Mar 08 03:37:57.457146 master-0 kubenswrapper[13046]: I0308 03:37:57.457007 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"]
Mar 08 03:37:57.500305 master-0 kubenswrapper[13046]: I0308 03:37:57.500227 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl"
Mar 08 03:37:57.629458 master-0 kubenswrapper[13046]: I0308 03:37:57.629408 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcjzl\" (UniqueName: \"kubernetes.io/projected/027393bc-e01e-4cb9-ade9-e9512d032984-kube-api-access-wcjzl\") pod \"027393bc-e01e-4cb9-ade9-e9512d032984\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") "
Mar 08 03:37:57.629688 master-0 kubenswrapper[13046]: I0308 03:37:57.629533 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-nb\") pod \"027393bc-e01e-4cb9-ade9-e9512d032984\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") "
Mar 08 03:37:57.629688 master-0 kubenswrapper[13046]: I0308 03:37:57.629585 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-sb\") pod \"027393bc-e01e-4cb9-ade9-e9512d032984\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") "
Mar 08 03:37:57.629688 master-0 kubenswrapper[13046]: I0308 03:37:57.629603 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-svc\") pod \"027393bc-e01e-4cb9-ade9-e9512d032984\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") "
Mar 08 03:37:57.629688 master-0 kubenswrapper[13046]: I0308 03:37:57.629641 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-swift-storage-0\") pod \"027393bc-e01e-4cb9-ade9-e9512d032984\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") "
Mar 08 03:37:57.629688 master-0 kubenswrapper[13046]: I0308 03:37:57.629662 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config\") pod \"027393bc-e01e-4cb9-ade9-e9512d032984\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") "
Mar 08 03:37:57.630039 master-0 kubenswrapper[13046]: I0308 03:37:57.629991 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kddb\" (UniqueName: \"kubernetes.io/projected/de2c515d-f93d-4089-b2fb-73b3b6f185a2-kube-api-access-2kddb\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.630039 master-0 kubenswrapper[13046]: I0308 03:37:57.630027 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.630116 master-0 kubenswrapper[13046]: I0308 03:37:57.630108 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.630148 master-0 kubenswrapper[13046]: I0308 03:37:57.630130 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.630242 master-0 kubenswrapper[13046]: I0308 03:37:57.630198 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.630286 master-0 kubenswrapper[13046]: I0308 03:37:57.630249 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.630339 master-0 kubenswrapper[13046]: I0308 03:37:57.630322 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.648145 master-0 kubenswrapper[13046]: I0308 03:37:57.636155 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027393bc-e01e-4cb9-ade9-e9512d032984-kube-api-access-wcjzl" (OuterVolumeSpecName: "kube-api-access-wcjzl") pod "027393bc-e01e-4cb9-ade9-e9512d032984" (UID: "027393bc-e01e-4cb9-ade9-e9512d032984"). InnerVolumeSpecName "kube-api-access-wcjzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:57.691839 master-0 kubenswrapper[13046]: I0308 03:37:57.687034 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "027393bc-e01e-4cb9-ade9-e9512d032984" (UID: "027393bc-e01e-4cb9-ade9-e9512d032984"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.710307 master-0 kubenswrapper[13046]: I0308 03:37:57.701107 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "027393bc-e01e-4cb9-ade9-e9512d032984" (UID: "027393bc-e01e-4cb9-ade9-e9512d032984"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.710307 master-0 kubenswrapper[13046]: I0308 03:37:57.704639 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "027393bc-e01e-4cb9-ade9-e9512d032984" (UID: "027393bc-e01e-4cb9-ade9-e9512d032984"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.733570 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config" (OuterVolumeSpecName: "config") pod "027393bc-e01e-4cb9-ade9-e9512d032984" (UID: "027393bc-e01e-4cb9-ade9-e9512d032984"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.733899 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config\") pod \"027393bc-e01e-4cb9-ade9-e9512d032984\" (UID: \"027393bc-e01e-4cb9-ade9-e9512d032984\") "
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: W0308 03:37:57.734021 13046 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/027393bc-e01e-4cb9-ade9-e9512d032984/volumes/kubernetes.io~configmap/config
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.734033 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config" (OuterVolumeSpecName: "config") pod "027393bc-e01e-4cb9-ade9-e9512d032984" (UID: "027393bc-e01e-4cb9-ade9-e9512d032984"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.734422 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.734451 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kddb\" (UniqueName: \"kubernetes.io/projected/de2c515d-f93d-4089-b2fb-73b3b6f185a2-kube-api-access-2kddb\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.734655 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.735035 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.735086 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.735339 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.735379 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.735604 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcjzl\" (UniqueName: \"kubernetes.io/projected/027393bc-e01e-4cb9-ade9-e9512d032984-kube-api-access-wcjzl\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.735646 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.735682 master-0 kubenswrapper[13046]: I0308 03:37:57.735666 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.740003 master-0 kubenswrapper[13046]: I0308 03:37:57.735863 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.740003 master-0 kubenswrapper[13046]: I0308 03:37:57.737633 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.740003 master-0 kubenswrapper[13046]: I0308 03:37:57.737662 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.740003 master-0 kubenswrapper[13046]: I0308 03:37:57.737672 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:57.740003 master-0 kubenswrapper[13046]: I0308 03:37:57.738803 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 03:37:57.740003 master-0 kubenswrapper[13046]: I0308 03:37:57.738824 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/46958aa2ee719eeb05ed2819dd5b5bd381312e73d29f62f1a59eb590a4eaa799/globalmount\"" pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.742584 master-0 kubenswrapper[13046]: I0308 03:37:57.742412 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.746154 master-0 kubenswrapper[13046]: I0308 03:37:57.744895 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.748363 master-0 kubenswrapper[13046]: I0308 03:37:57.748143 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "027393bc-e01e-4cb9-ade9-e9512d032984" (UID: "027393bc-e01e-4cb9-ade9-e9512d032984"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:37:57.766028 master-0 kubenswrapper[13046]: I0308 03:37:57.750373 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.772519 master-0 kubenswrapper[13046]: I0308 03:37:57.767000 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kddb\" (UniqueName: \"kubernetes.io/projected/de2c515d-f93d-4089-b2fb-73b3b6f185a2-kube-api-access-2kddb\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:37:57.839443 master-0 kubenswrapper[13046]: I0308 03:37:57.839386 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/027393bc-e01e-4cb9-ade9-e9512d032984-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 08 03:37:58.006833 master-0 kubenswrapper[13046]: I0308 03:37:58.006785 13046 generic.go:334] "Generic (PLEG): container finished" podID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerID="ef4fadf57b9bdef465be4bb304a7237f4ebcca696a585847ea78adfd97a00bf7" exitCode=0
Mar 08 03:37:58.007468 master-0 kubenswrapper[13046]: I0308 03:37:58.006876 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" event={"ID":"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37","Type":"ContainerDied","Data":"ef4fadf57b9bdef465be4bb304a7237f4ebcca696a585847ea78adfd97a00bf7"}
Mar 08 03:37:58.010778 master-0 kubenswrapper[13046]: I0308 03:37:58.010238 13046 generic.go:334] "Generic (PLEG): container finished" podID="f6b0006e-a04e-4dcb-a516-b6d02385a494" containerID="1bcbec65c1a02bf8dcc7269e365f626ba1fcfc51e07bbf539eddcee7a1c8e922" exitCode=0
Mar 08 03:37:58.010778 master-0 kubenswrapper[13046]: I0308 03:37:58.010594 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7752-account-create-update-qwq4w" event={"ID":"f6b0006e-a04e-4dcb-a516-b6d02385a494","Type":"ContainerDied","Data":"1bcbec65c1a02bf8dcc7269e365f626ba1fcfc51e07bbf539eddcee7a1c8e922"}
Mar 08 03:37:58.024220 master-0 kubenswrapper[13046]: I0308 03:37:58.024131 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:37:58.025245 master-0 kubenswrapper[13046]: I0308 03:37:58.025101 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl" event={"ID":"027393bc-e01e-4cb9-ade9-e9512d032984","Type":"ContainerDied","Data":"a8417ef880899f1be4ae4b8369156b3dfdd6d533475cce721d882143b6b36f2c"}
Mar 08 03:37:58.025245 master-0 kubenswrapper[13046]: I0308 03:37:58.025149 13046 scope.go:117] "RemoveContainer" containerID="ec4b5090fcc9e884948731710e0636a0d85da8d133697044f085e8882247bde5"
Mar 08 03:37:58.026240 master-0 kubenswrapper[13046]: I0308 03:37:58.026169 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6c9bdf-4zqjl"
Mar 08 03:37:58.089819 master-0 kubenswrapper[13046]: I0308 03:37:58.089413 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:37:58.257126 master-0 kubenswrapper[13046]: I0308 03:37:58.256990 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64152fe8-511f-4cdf-ab75-923920cf3c56" path="/var/lib/kubelet/pods/64152fe8-511f-4cdf-ab75-923920cf3c56/volumes"
Mar 08 03:37:58.353573 master-0 kubenswrapper[13046]: I0308 03:37:58.344367 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") " pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:37:58.357684 master-0 kubenswrapper[13046]: I0308 03:37:58.356960 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-combined-ca-bundle\") pod \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") "
Mar 08 03:37:58.357684 master-0 kubenswrapper[13046]: I0308 03:37:58.357125 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") "
Mar 08 03:37:58.357684 master-0 kubenswrapper[13046]: I0308 03:37:58.357151 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7prf\" (UniqueName: \"kubernetes.io/projected/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-kube-api-access-w7prf\") pod \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") "
Mar 08 03:37:58.357684 master-0 kubenswrapper[13046]: I0308 03:37:58.357190 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-logs\") pod \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") "
Mar 08 03:37:58.357684 master-0 kubenswrapper[13046]: I0308 03:37:58.357225 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-config-data\") pod \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") "
Mar 08 03:37:58.357684 master-0 kubenswrapper[13046]: I0308 03:37:58.357363 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-httpd-run\") pod \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") "
Mar 08 03:37:58.357684 master-0 kubenswrapper[13046]: I0308 03:37:58.357382 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-scripts\") pod \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\" (UID: \"64de8d37-91d1-4ee9-8067-4a9bac68aa8c\") "
Mar 08 03:37:58.372057 master-0 kubenswrapper[13046]: I0308 03:37:58.372024 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-logs" (OuterVolumeSpecName: "logs") pod "64de8d37-91d1-4ee9-8067-4a9bac68aa8c" (UID: "64de8d37-91d1-4ee9-8067-4a9bac68aa8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:37:58.373009 master-0 kubenswrapper[13046]: I0308 03:37:58.372964 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-kube-api-access-w7prf" (OuterVolumeSpecName: "kube-api-access-w7prf") pod "64de8d37-91d1-4ee9-8067-4a9bac68aa8c" (UID: "64de8d37-91d1-4ee9-8067-4a9bac68aa8c"). InnerVolumeSpecName "kube-api-access-w7prf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:37:58.391468 master-0 kubenswrapper[13046]: I0308 03:37:58.391414 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-config-data" (OuterVolumeSpecName: "config-data") pod "64de8d37-91d1-4ee9-8067-4a9bac68aa8c" (UID: "64de8d37-91d1-4ee9-8067-4a9bac68aa8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:37:58.391582 master-0 kubenswrapper[13046]: I0308 03:37:58.391459 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64de8d37-91d1-4ee9-8067-4a9bac68aa8c" (UID: "64de8d37-91d1-4ee9-8067-4a9bac68aa8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:37:58.400564 master-0 kubenswrapper[13046]: I0308 03:37:58.399691 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-scripts" (OuterVolumeSpecName: "scripts") pod "64de8d37-91d1-4ee9-8067-4a9bac68aa8c" (UID: "64de8d37-91d1-4ee9-8067-4a9bac68aa8c"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:37:58.401832 master-0 kubenswrapper[13046]: I0308 03:37:58.401794 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "64de8d37-91d1-4ee9-8067-4a9bac68aa8c" (UID: "64de8d37-91d1-4ee9-8067-4a9bac68aa8c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:37:58.437984 master-0 kubenswrapper[13046]: I0308 03:37:58.435198 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6c9bdf-4zqjl"] Mar 08 03:37:58.461011 master-0 kubenswrapper[13046]: I0308 03:37:58.459433 13046 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:58.461011 master-0 kubenswrapper[13046]: I0308 03:37:58.459469 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:58.461011 master-0 kubenswrapper[13046]: I0308 03:37:58.459497 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:58.461011 master-0 kubenswrapper[13046]: I0308 03:37:58.459507 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7prf\" (UniqueName: \"kubernetes.io/projected/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-kube-api-access-w7prf\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:58.461011 master-0 kubenswrapper[13046]: I0308 03:37:58.459515 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:58.461011 master-0 kubenswrapper[13046]: I0308 03:37:58.459523 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de8d37-91d1-4ee9-8067-4a9bac68aa8c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:58.517421 master-0 kubenswrapper[13046]: I0308 03:37:58.517383 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6c9bdf-4zqjl"] Mar 08 03:37:58.638435 master-0 kubenswrapper[13046]: I0308 03:37:58.638135 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:58.668615 master-0 kubenswrapper[13046]: I0308 03:37:58.666842 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5ggm\" (UniqueName: \"kubernetes.io/projected/3cf185b5-9d0e-48e6-a961-fba08aa4688a-kube-api-access-r5ggm\") pod \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " Mar 08 03:37:58.668615 master-0 kubenswrapper[13046]: I0308 03:37:58.667096 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cf185b5-9d0e-48e6-a961-fba08aa4688a-operator-scripts\") pod \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\" (UID: \"3cf185b5-9d0e-48e6-a961-fba08aa4688a\") " Mar 08 03:37:58.668615 master-0 kubenswrapper[13046]: I0308 03:37:58.668606 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cf185b5-9d0e-48e6-a961-fba08aa4688a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cf185b5-9d0e-48e6-a961-fba08aa4688a" (UID: "3cf185b5-9d0e-48e6-a961-fba08aa4688a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:58.689528 master-0 kubenswrapper[13046]: I0308 03:37:58.687131 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf185b5-9d0e-48e6-a961-fba08aa4688a-kube-api-access-r5ggm" (OuterVolumeSpecName: "kube-api-access-r5ggm") pod "3cf185b5-9d0e-48e6-a961-fba08aa4688a" (UID: "3cf185b5-9d0e-48e6-a961-fba08aa4688a"). InnerVolumeSpecName "kube-api-access-r5ggm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:58.771586 master-0 kubenswrapper[13046]: I0308 03:37:58.771457 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5ggm\" (UniqueName: \"kubernetes.io/projected/3cf185b5-9d0e-48e6-a961-fba08aa4688a-kube-api-access-r5ggm\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:58.771586 master-0 kubenswrapper[13046]: I0308 03:37:58.771507 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cf185b5-9d0e-48e6-a961-fba08aa4688a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:59.043068 master-0 kubenswrapper[13046]: I0308 03:37:59.042942 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" event={"ID":"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37","Type":"ContainerStarted","Data":"3e6d6d7f08325e7296da53a6fb4b5387b33fb898f4e52f930fd04a5dc0514c5c"} Mar 08 03:37:59.043575 master-0 kubenswrapper[13046]: I0308 03:37:59.043550 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:37:59.046460 master-0 kubenswrapper[13046]: I0308 03:37:59.046419 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-b2m2r" Mar 08 03:37:59.048587 master-0 kubenswrapper[13046]: I0308 03:37:59.047894 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-b2m2r" event={"ID":"3cf185b5-9d0e-48e6-a961-fba08aa4688a","Type":"ContainerDied","Data":"49cf933ce61c0a0588fe38697c4b0bb47975ac2902f699da3b501c2748172e60"} Mar 08 03:37:59.048587 master-0 kubenswrapper[13046]: I0308 03:37:59.047952 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49cf933ce61c0a0588fe38697c4b0bb47975ac2902f699da3b501c2748172e60" Mar 08 03:37:59.048587 master-0 kubenswrapper[13046]: I0308 03:37:59.048053 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:37:59.108304 master-0 kubenswrapper[13046]: I0308 03:37:59.106169 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" podStartSLOduration=5.10591272 podStartE2EDuration="5.10591272s" podCreationTimestamp="2026-03-08 03:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:37:59.070678359 +0000 UTC m=+1481.149445586" watchObservedRunningTime="2026-03-08 03:37:59.10591272 +0000 UTC m=+1481.184679967" Mar 08 03:37:59.570655 master-0 kubenswrapper[13046]: I0308 03:37:59.570230 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:37:59.612766 master-0 kubenswrapper[13046]: I0308 03:37:59.599443 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgwdj\" (UniqueName: \"kubernetes.io/projected/f6b0006e-a04e-4dcb-a516-b6d02385a494-kube-api-access-dgwdj\") pod \"f6b0006e-a04e-4dcb-a516-b6d02385a494\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " Mar 08 03:37:59.612766 master-0 kubenswrapper[13046]: I0308 03:37:59.599548 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b0006e-a04e-4dcb-a516-b6d02385a494-operator-scripts\") pod \"f6b0006e-a04e-4dcb-a516-b6d02385a494\" (UID: \"f6b0006e-a04e-4dcb-a516-b6d02385a494\") " Mar 08 03:37:59.612766 master-0 kubenswrapper[13046]: I0308 03:37:59.600563 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b0006e-a04e-4dcb-a516-b6d02385a494-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6b0006e-a04e-4dcb-a516-b6d02385a494" (UID: "f6b0006e-a04e-4dcb-a516-b6d02385a494"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:37:59.612766 master-0 kubenswrapper[13046]: I0308 03:37:59.611083 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b0006e-a04e-4dcb-a516-b6d02385a494-kube-api-access-dgwdj" (OuterVolumeSpecName: "kube-api-access-dgwdj") pod "f6b0006e-a04e-4dcb-a516-b6d02385a494" (UID: "f6b0006e-a04e-4dcb-a516-b6d02385a494"). InnerVolumeSpecName "kube-api-access-dgwdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:37:59.702582 master-0 kubenswrapper[13046]: I0308 03:37:59.702516 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgwdj\" (UniqueName: \"kubernetes.io/projected/f6b0006e-a04e-4dcb-a516-b6d02385a494-kube-api-access-dgwdj\") on node \"master-0\" DevicePath \"\"" Mar 08 03:37:59.702582 master-0 kubenswrapper[13046]: I0308 03:37:59.702557 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6b0006e-a04e-4dcb-a516-b6d02385a494-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:00.059236 master-0 kubenswrapper[13046]: I0308 03:38:00.059169 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7752-account-create-update-qwq4w" event={"ID":"f6b0006e-a04e-4dcb-a516-b6d02385a494","Type":"ContainerDied","Data":"a5ccc39beb1396e690572ed31c100f656d40814a7144ef2842acc64325078985"} Mar 08 03:38:00.059236 master-0 kubenswrapper[13046]: I0308 03:38:00.059221 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ccc39beb1396e690572ed31c100f656d40814a7144ef2842acc64325078985" Mar 08 03:38:00.059236 master-0 kubenswrapper[13046]: I0308 03:38:00.059191 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7752-account-create-update-qwq4w" Mar 08 03:38:00.103996 master-0 kubenswrapper[13046]: I0308 03:38:00.103951 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82" (OuterVolumeSpecName: "glance") pod "64de8d37-91d1-4ee9-8067-4a9bac68aa8c" (UID: "64de8d37-91d1-4ee9-8067-4a9bac68aa8c"). InnerVolumeSpecName "pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 03:38:00.106705 master-0 kubenswrapper[13046]: I0308 03:38:00.106643 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:00.112528 master-0 kubenswrapper[13046]: I0308 03:38:00.112468 13046 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") on node \"master-0\" " Mar 08 03:38:00.136326 master-0 kubenswrapper[13046]: I0308 03:38:00.136268 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027393bc-e01e-4cb9-ade9-e9512d032984" path="/var/lib/kubelet/pods/027393bc-e01e-4cb9-ade9-e9512d032984/volumes" Mar 08 03:38:00.141105 master-0 kubenswrapper[13046]: I0308 03:38:00.140445 13046 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 03:38:00.141105 master-0 kubenswrapper[13046]: I0308 03:38:00.140601 13046 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd" (UniqueName: "kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82") on node "master-0" Mar 08 03:38:00.168355 master-0 kubenswrapper[13046]: I0308 03:38:00.168303 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:00.214740 master-0 kubenswrapper[13046]: I0308 03:38:00.214501 13046 reconciler_common.go:293] "Volume detached for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:00.345521 master-0 kubenswrapper[13046]: I0308 03:38:00.339186 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:00.372737 master-0 kubenswrapper[13046]: I0308 03:38:00.364082 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: I0308 03:38:00.393314 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: E0308 03:38:00.394747 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027393bc-e01e-4cb9-ade9-e9512d032984" containerName="init" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: I0308 03:38:00.394765 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="027393bc-e01e-4cb9-ade9-e9512d032984" containerName="init" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: E0308 03:38:00.394791 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b0006e-a04e-4dcb-a516-b6d02385a494" containerName="mariadb-account-create-update" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: I0308 03:38:00.394797 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b0006e-a04e-4dcb-a516-b6d02385a494" containerName="mariadb-account-create-update" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: E0308 03:38:00.394831 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf185b5-9d0e-48e6-a961-fba08aa4688a" 
containerName="mariadb-database-create" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: I0308 03:38:00.394838 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf185b5-9d0e-48e6-a961-fba08aa4688a" containerName="mariadb-database-create" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: I0308 03:38:00.395118 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf185b5-9d0e-48e6-a961-fba08aa4688a" containerName="mariadb-database-create" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: I0308 03:38:00.395155 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="027393bc-e01e-4cb9-ade9-e9512d032984" containerName="init" Mar 08 03:38:00.396094 master-0 kubenswrapper[13046]: I0308 03:38:00.395270 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b0006e-a04e-4dcb-a516-b6d02385a494" containerName="mariadb-account-create-update" Mar 08 03:38:00.401256 master-0 kubenswrapper[13046]: I0308 03:38:00.396838 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.401921 master-0 kubenswrapper[13046]: I0308 03:38:00.401882 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-default-external-config-data" Mar 08 03:38:00.409048 master-0 kubenswrapper[13046]: I0308 03:38:00.408243 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:00.523932 master-0 kubenswrapper[13046]: I0308 03:38:00.523855 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.525751 master-0 kubenswrapper[13046]: I0308 03:38:00.525683 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.528208 master-0 kubenswrapper[13046]: I0308 03:38:00.526585 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7tj\" (UniqueName: \"kubernetes.io/projected/3172792f-8e97-4624-a81c-c165f8068600-kube-api-access-bq7tj\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.528208 master-0 kubenswrapper[13046]: I0308 03:38:00.526671 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.528208 master-0 kubenswrapper[13046]: I0308 03:38:00.526754 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.528208 master-0 kubenswrapper[13046]: I0308 03:38:00.526920 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.528208 master-0 kubenswrapper[13046]: I0308 03:38:00.527031 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.629411 master-0 kubenswrapper[13046]: I0308 03:38:00.629347 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.629653 master-0 kubenswrapper[13046]: I0308 03:38:00.629435 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.629653 master-0 kubenswrapper[13046]: I0308 03:38:00.629560 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.629653 master-0 kubenswrapper[13046]: I0308 03:38:00.629634 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.629800 master-0 kubenswrapper[13046]: I0308 03:38:00.629722 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.629800 master-0 kubenswrapper[13046]: I0308 03:38:00.629784 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 
03:38:00.629888 master-0 kubenswrapper[13046]: I0308 03:38:00.629817 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7tj\" (UniqueName: \"kubernetes.io/projected/3172792f-8e97-4624-a81c-c165f8068600-kube-api-access-bq7tj\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.629962 master-0 kubenswrapper[13046]: I0308 03:38:00.629920 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.630564 master-0 kubenswrapper[13046]: I0308 03:38:00.630541 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.633170 master-0 kubenswrapper[13046]: I0308 03:38:00.633127 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 03:38:00.633265 master-0 kubenswrapper[13046]: I0308 03:38:00.633194 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7ee56d67d41b92c054773f64bf7346771894ec4c4c18aa0117ea2564b4c6d4a8/globalmount\"" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.637362 master-0 kubenswrapper[13046]: I0308 03:38:00.637327 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.637528 master-0 kubenswrapper[13046]: I0308 03:38:00.637465 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.638079 master-0 kubenswrapper[13046]: I0308 03:38:00.638042 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:00.646333 master-0 kubenswrapper[13046]: I0308 03:38:00.646290 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7tj\" (UniqueName: 
\"kubernetes.io/projected/3172792f-8e97-4624-a81c-c165f8068600-kube-api-access-bq7tj\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:01.987055 master-0 kubenswrapper[13046]: I0308 03:38:01.985308 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:02.092781 master-0 kubenswrapper[13046]: I0308 03:38:02.092733 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzj84" event={"ID":"2f71b2d7-115f-473a-9427-8af24a1a7467","Type":"ContainerStarted","Data":"5f2feb18084b6b8394640bdd39cec82ea0afd0d4e5afeea24398dbf8c76cceb1"} Mar 08 03:38:02.098742 master-0 kubenswrapper[13046]: I0308 03:38:02.098693 13046 generic.go:334] "Generic (PLEG): container finished" podID="cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" containerID="b1d28aefb2d4d0f07983029a4b00c727be020bd6d0e72d488ab98cca65d32172" exitCode=0 Mar 08 03:38:02.098925 master-0 kubenswrapper[13046]: I0308 03:38:02.098749 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qv4vs" event={"ID":"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe","Type":"ContainerDied","Data":"b1d28aefb2d4d0f07983029a4b00c727be020bd6d0e72d488ab98cca65d32172"} Mar 08 03:38:02.116899 master-0 kubenswrapper[13046]: I0308 03:38:02.116820 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dzj84" podStartSLOduration=3.074684032 podStartE2EDuration="8.116802183s" podCreationTimestamp="2026-03-08 03:37:54 +0000 UTC" firstStartedPulling="2026-03-08 03:37:56.777635193 +0000 UTC m=+1478.856402410" lastFinishedPulling="2026-03-08 
03:38:01.819753324 +0000 UTC m=+1483.898520561" observedRunningTime="2026-03-08 03:38:02.11104455 +0000 UTC m=+1484.189811767" watchObservedRunningTime="2026-03-08 03:38:02.116802183 +0000 UTC m=+1484.195569400" Mar 08 03:38:02.148256 master-0 kubenswrapper[13046]: I0308 03:38:02.148212 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64de8d37-91d1-4ee9-8067-4a9bac68aa8c" path="/var/lib/kubelet/pods/64de8d37-91d1-4ee9-8067-4a9bac68aa8c/volumes" Mar 08 03:38:02.218157 master-0 kubenswrapper[13046]: I0308 03:38:02.218093 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:02.419926 master-0 kubenswrapper[13046]: I0308 03:38:02.419626 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:38:02.880312 master-0 kubenswrapper[13046]: I0308 03:38:02.880247 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:03.136109 master-0 kubenswrapper[13046]: I0308 03:38:03.136052 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"de2c515d-f93d-4089-b2fb-73b3b6f185a2","Type":"ContainerStarted","Data":"c73e925b670a49236a550646f1834b2e0477a04dc5a0290f4b497795cd29cf3f"} Mar 08 03:38:03.136109 master-0 kubenswrapper[13046]: I0308 03:38:03.136118 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"de2c515d-f93d-4089-b2fb-73b3b6f185a2","Type":"ContainerStarted","Data":"a858379d964d6a8d4447050f270f02e72632c117d2b426eb64a2497cf21a8c5c"} Mar 08 03:38:03.144002 master-0 kubenswrapper[13046]: I0308 03:38:03.143958 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" 
event={"ID":"3172792f-8e97-4624-a81c-c165f8068600","Type":"ContainerStarted","Data":"854cba4180d2afe3d0770e4934208ffcfe57fcb19ff9390fd25279aadcd1523b"} Mar 08 03:38:03.533323 master-0 kubenswrapper[13046]: I0308 03:38:03.533280 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:38:03.631918 master-0 kubenswrapper[13046]: I0308 03:38:03.631852 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-config-data\") pod \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " Mar 08 03:38:03.632128 master-0 kubenswrapper[13046]: I0308 03:38:03.632021 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmw9j\" (UniqueName: \"kubernetes.io/projected/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-kube-api-access-gmw9j\") pod \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " Mar 08 03:38:03.632128 master-0 kubenswrapper[13046]: I0308 03:38:03.632090 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-credential-keys\") pod \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " Mar 08 03:38:03.632128 master-0 kubenswrapper[13046]: I0308 03:38:03.632116 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-scripts\") pod \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " Mar 08 03:38:03.632224 master-0 kubenswrapper[13046]: I0308 03:38:03.632150 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-fernet-keys\") pod \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " Mar 08 03:38:03.632276 master-0 kubenswrapper[13046]: I0308 03:38:03.632260 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-combined-ca-bundle\") pod \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\" (UID: \"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe\") " Mar 08 03:38:03.639695 master-0 kubenswrapper[13046]: I0308 03:38:03.639599 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" (UID: "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:03.643384 master-0 kubenswrapper[13046]: I0308 03:38:03.643318 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-kube-api-access-gmw9j" (OuterVolumeSpecName: "kube-api-access-gmw9j") pod "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" (UID: "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe"). InnerVolumeSpecName "kube-api-access-gmw9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:03.647382 master-0 kubenswrapper[13046]: I0308 03:38:03.647251 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" (UID: "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:03.650395 master-0 kubenswrapper[13046]: I0308 03:38:03.650356 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-scripts" (OuterVolumeSpecName: "scripts") pod "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" (UID: "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:03.666390 master-0 kubenswrapper[13046]: I0308 03:38:03.666278 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-config-data" (OuterVolumeSpecName: "config-data") pod "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" (UID: "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:03.670282 master-0 kubenswrapper[13046]: I0308 03:38:03.670229 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" (UID: "cfb042fa-8f65-4e21-8d8e-d69c422e3cfe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:03.737145 master-0 kubenswrapper[13046]: I0308 03:38:03.737095 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmw9j\" (UniqueName: \"kubernetes.io/projected/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-kube-api-access-gmw9j\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:03.737145 master-0 kubenswrapper[13046]: I0308 03:38:03.737145 13046 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:03.737145 master-0 kubenswrapper[13046]: I0308 03:38:03.737158 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:03.737145 master-0 kubenswrapper[13046]: I0308 03:38:03.737169 13046 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:03.737145 master-0 kubenswrapper[13046]: I0308 03:38:03.737181 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:03.737813 master-0 kubenswrapper[13046]: I0308 03:38:03.737192 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:03.895747 master-0 kubenswrapper[13046]: I0308 03:38:03.895678 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:04.002526 master-0 kubenswrapper[13046]: 
I0308 03:38:04.001358 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:38:04.179504 master-0 kubenswrapper[13046]: I0308 03:38:04.179431 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"de2c515d-f93d-4089-b2fb-73b3b6f185a2","Type":"ContainerStarted","Data":"a35e64bc4f23c62a2d0c0072d1e833d761f72f161494dae611f8f6a5bcc32b25"} Mar 08 03:38:04.187565 master-0 kubenswrapper[13046]: I0308 03:38:04.187509 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3172792f-8e97-4624-a81c-c165f8068600","Type":"ContainerStarted","Data":"fc5e842357b2bf02e0e0875ca48bf772ff7886a805b08c2b7fd07775f88881b6"} Mar 08 03:38:04.190707 master-0 kubenswrapper[13046]: I0308 03:38:04.190614 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qv4vs" event={"ID":"cfb042fa-8f65-4e21-8d8e-d69c422e3cfe","Type":"ContainerDied","Data":"f35a05d83d76f6e69f3fa02ef2bbac803a8038ac5e36963b0ab1cbd58658e0e5"} Mar 08 03:38:04.190707 master-0 kubenswrapper[13046]: I0308 03:38:04.190704 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f35a05d83d76f6e69f3fa02ef2bbac803a8038ac5e36963b0ab1cbd58658e0e5" Mar 08 03:38:04.190830 master-0 kubenswrapper[13046]: I0308 03:38:04.190790 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qv4vs" Mar 08 03:38:04.399296 master-0 kubenswrapper[13046]: I0308 03:38:04.399206 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bf784-default-internal-api-0" podStartSLOduration=7.399182907 podStartE2EDuration="7.399182907s" podCreationTimestamp="2026-03-08 03:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:04.387801714 +0000 UTC m=+1486.466568931" watchObservedRunningTime="2026-03-08 03:38:04.399182907 +0000 UTC m=+1486.477950144" Mar 08 03:38:04.592041 master-0 kubenswrapper[13046]: I0308 03:38:04.591956 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qv4vs"] Mar 08 03:38:04.698680 master-0 kubenswrapper[13046]: I0308 03:38:04.694831 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qv4vs"] Mar 08 03:38:04.856095 master-0 kubenswrapper[13046]: I0308 03:38:04.855971 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7fl4v"] Mar 08 03:38:04.856814 master-0 kubenswrapper[13046]: E0308 03:38:04.856572 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" containerName="keystone-bootstrap" Mar 08 03:38:04.856814 master-0 kubenswrapper[13046]: I0308 03:38:04.856601 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" containerName="keystone-bootstrap" Mar 08 03:38:04.857062 master-0 kubenswrapper[13046]: I0308 03:38:04.857029 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" containerName="keystone-bootstrap" Mar 08 03:38:04.857908 master-0 kubenswrapper[13046]: I0308 03:38:04.857878 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:04.862737 master-0 kubenswrapper[13046]: I0308 03:38:04.862351 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 03:38:04.862737 master-0 kubenswrapper[13046]: I0308 03:38:04.862458 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 03:38:04.862737 master-0 kubenswrapper[13046]: I0308 03:38:04.862641 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 03:38:04.862915 master-0 kubenswrapper[13046]: I0308 03:38:04.862784 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 03:38:04.910049 master-0 kubenswrapper[13046]: I0308 03:38:04.909987 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7fl4v"] Mar 08 03:38:04.966388 master-0 kubenswrapper[13046]: I0308 03:38:04.966336 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-config-data\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:04.966388 master-0 kubenswrapper[13046]: I0308 03:38:04.966389 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-fernet-keys\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:04.966662 master-0 kubenswrapper[13046]: I0308 03:38:04.966627 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-credential-keys\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:04.966993 master-0 kubenswrapper[13046]: I0308 03:38:04.966966 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-scripts\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:04.967042 master-0 kubenswrapper[13046]: I0308 03:38:04.967016 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvc8s\" (UniqueName: \"kubernetes.io/projected/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-kube-api-access-dvc8s\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:04.967323 master-0 kubenswrapper[13046]: I0308 03:38:04.967296 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-combined-ca-bundle\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.069163 master-0 kubenswrapper[13046]: I0308 03:38:05.069091 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-scripts\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.069374 master-0 kubenswrapper[13046]: I0308 03:38:05.069173 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dvc8s\" (UniqueName: \"kubernetes.io/projected/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-kube-api-access-dvc8s\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.069374 master-0 kubenswrapper[13046]: I0308 03:38:05.069306 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-combined-ca-bundle\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.069374 master-0 kubenswrapper[13046]: I0308 03:38:05.069356 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-config-data\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.069472 master-0 kubenswrapper[13046]: I0308 03:38:05.069397 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-fernet-keys\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.069572 master-0 kubenswrapper[13046]: I0308 03:38:05.069548 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-credential-keys\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.073110 master-0 kubenswrapper[13046]: I0308 03:38:05.073073 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-combined-ca-bundle\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.074931 master-0 kubenswrapper[13046]: I0308 03:38:05.074899 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-fernet-keys\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.075288 master-0 kubenswrapper[13046]: I0308 03:38:05.075259 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-credential-keys\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.076472 master-0 kubenswrapper[13046]: I0308 03:38:05.075899 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-scripts\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.076472 master-0 kubenswrapper[13046]: I0308 03:38:05.076431 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-config-data\") pod \"keystone-bootstrap-7fl4v\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.213320 master-0 kubenswrapper[13046]: I0308 03:38:05.213178 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" 
event={"ID":"3172792f-8e97-4624-a81c-c165f8068600","Type":"ContainerStarted","Data":"7341ca39f54fd259cfc0f07022cf8924a4503d51d569ec75af422f6a149ce7e7"} Mar 08 03:38:05.213888 master-0 kubenswrapper[13046]: I0308 03:38:05.213349 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-internal-api-0" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-log" containerID="cri-o://c73e925b670a49236a550646f1834b2e0477a04dc5a0290f4b497795cd29cf3f" gracePeriod=30 Mar 08 03:38:05.214105 master-0 kubenswrapper[13046]: I0308 03:38:05.213944 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-external-api-0" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-log" containerID="cri-o://fc5e842357b2bf02e0e0875ca48bf772ff7886a805b08c2b7fd07775f88881b6" gracePeriod=30 Mar 08 03:38:05.214105 master-0 kubenswrapper[13046]: I0308 03:38:05.213949 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-internal-api-0" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-httpd" containerID="cri-o://a35e64bc4f23c62a2d0c0072d1e833d761f72f161494dae611f8f6a5bcc32b25" gracePeriod=30 Mar 08 03:38:05.214683 master-0 kubenswrapper[13046]: I0308 03:38:05.214129 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-external-api-0" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-httpd" containerID="cri-o://7341ca39f54fd259cfc0f07022cf8924a4503d51d569ec75af422f6a149ce7e7" gracePeriod=30 Mar 08 03:38:05.254713 master-0 kubenswrapper[13046]: I0308 03:38:05.254661 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvc8s\" (UniqueName: \"kubernetes.io/projected/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-kube-api-access-dvc8s\") pod \"keystone-bootstrap-7fl4v\" (UID: 
\"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.281136 master-0 kubenswrapper[13046]: I0308 03:38:05.281055 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bf784-default-external-api-0" podStartSLOduration=5.281036125 podStartE2EDuration="5.281036125s" podCreationTimestamp="2026-03-08 03:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:05.264475804 +0000 UTC m=+1487.343243041" watchObservedRunningTime="2026-03-08 03:38:05.281036125 +0000 UTC m=+1487.359803342" Mar 08 03:38:05.385508 master-0 kubenswrapper[13046]: I0308 03:38:05.383280 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-j4bdp"] Mar 08 03:38:05.385726 master-0 kubenswrapper[13046]: I0308 03:38:05.385627 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.391375 master-0 kubenswrapper[13046]: I0308 03:38:05.390574 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 08 03:38:05.391375 master-0 kubenswrapper[13046]: I0308 03:38:05.390632 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Mar 08 03:38:05.413639 master-0 kubenswrapper[13046]: I0308 03:38:05.398858 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-j4bdp"] Mar 08 03:38:05.450679 master-0 kubenswrapper[13046]: I0308 03:38:05.450629 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:38:05.480171 master-0 kubenswrapper[13046]: I0308 03:38:05.479490 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtvgg\" (UniqueName: 
\"kubernetes.io/projected/4248bb04-5f13-4afc-9263-49f3c929cd50-kube-api-access-qtvgg\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.480171 master-0 kubenswrapper[13046]: I0308 03:38:05.479553 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-scripts\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.480171 master-0 kubenswrapper[13046]: I0308 03:38:05.479581 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.480171 master-0 kubenswrapper[13046]: I0308 03:38:05.479616 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data-merged\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.480171 master-0 kubenswrapper[13046]: I0308 03:38:05.479671 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4248bb04-5f13-4afc-9263-49f3c929cd50-etc-podinfo\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.480171 master-0 kubenswrapper[13046]: I0308 03:38:05.479703 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-combined-ca-bundle\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.488392 master-0 kubenswrapper[13046]: I0308 03:38:05.487985 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:05.544814 master-0 kubenswrapper[13046]: I0308 03:38:05.544757 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8f6c88c7-62p4t"] Mar 08 03:38:05.545332 master-0 kubenswrapper[13046]: I0308 03:38:05.545074 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" podUID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerName="dnsmasq-dns" containerID="cri-o://c2ff03ed8f6c4c79eafb130fac7fbb943bb63acc7415c283d5142fba3d3e695f" gracePeriod=10 Mar 08 03:38:05.582056 master-0 kubenswrapper[13046]: I0308 03:38:05.581925 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtvgg\" (UniqueName: \"kubernetes.io/projected/4248bb04-5f13-4afc-9263-49f3c929cd50-kube-api-access-qtvgg\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.582056 master-0 kubenswrapper[13046]: I0308 03:38:05.581989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-scripts\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.582056 master-0 kubenswrapper[13046]: I0308 03:38:05.582019 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.582056 master-0 kubenswrapper[13046]: I0308 03:38:05.582056 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data-merged\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.582355 master-0 kubenswrapper[13046]: I0308 03:38:05.582134 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4248bb04-5f13-4afc-9263-49f3c929cd50-etc-podinfo\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.582355 master-0 kubenswrapper[13046]: I0308 03:38:05.582185 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-combined-ca-bundle\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.595388 master-0 kubenswrapper[13046]: I0308 03:38:05.595335 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.596008 master-0 kubenswrapper[13046]: I0308 03:38:05.595958 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data-merged\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.603050 master-0 kubenswrapper[13046]: I0308 03:38:05.602995 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-combined-ca-bundle\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.604395 master-0 kubenswrapper[13046]: I0308 03:38:05.604255 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4248bb04-5f13-4afc-9263-49f3c929cd50-etc-podinfo\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.607492 master-0 kubenswrapper[13046]: I0308 03:38:05.607448 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtvgg\" (UniqueName: \"kubernetes.io/projected/4248bb04-5f13-4afc-9263-49f3c929cd50-kube-api-access-qtvgg\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.624529 master-0 kubenswrapper[13046]: I0308 03:38:05.621081 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-scripts\") pod \"ironic-db-sync-j4bdp\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:05.638501 master-0 kubenswrapper[13046]: I0308 03:38:05.638351 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:06.133668 master-0 kubenswrapper[13046]: I0308 03:38:06.133612 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb042fa-8f65-4e21-8d8e-d69c422e3cfe" path="/var/lib/kubelet/pods/cfb042fa-8f65-4e21-8d8e-d69c422e3cfe/volumes" Mar 08 03:38:06.245439 master-0 kubenswrapper[13046]: I0308 03:38:06.245335 13046 generic.go:334] "Generic (PLEG): container finished" podID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerID="a35e64bc4f23c62a2d0c0072d1e833d761f72f161494dae611f8f6a5bcc32b25" exitCode=0 Mar 08 03:38:06.245439 master-0 kubenswrapper[13046]: I0308 03:38:06.245369 13046 generic.go:334] "Generic (PLEG): container finished" podID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerID="c73e925b670a49236a550646f1834b2e0477a04dc5a0290f4b497795cd29cf3f" exitCode=143 Mar 08 03:38:06.245439 master-0 kubenswrapper[13046]: I0308 03:38:06.245419 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"de2c515d-f93d-4089-b2fb-73b3b6f185a2","Type":"ContainerDied","Data":"a35e64bc4f23c62a2d0c0072d1e833d761f72f161494dae611f8f6a5bcc32b25"} Mar 08 03:38:06.246031 master-0 kubenswrapper[13046]: I0308 03:38:06.245474 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"de2c515d-f93d-4089-b2fb-73b3b6f185a2","Type":"ContainerDied","Data":"c73e925b670a49236a550646f1834b2e0477a04dc5a0290f4b497795cd29cf3f"} Mar 08 03:38:06.248637 master-0 kubenswrapper[13046]: I0308 03:38:06.248600 13046 generic.go:334] "Generic (PLEG): container finished" podID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerID="c2ff03ed8f6c4c79eafb130fac7fbb943bb63acc7415c283d5142fba3d3e695f" exitCode=0 Mar 08 03:38:06.248699 master-0 kubenswrapper[13046]: I0308 03:38:06.248685 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" 
event={"ID":"6312c2ac-32ce-4040-91dd-3c0193d10918","Type":"ContainerDied","Data":"c2ff03ed8f6c4c79eafb130fac7fbb943bb63acc7415c283d5142fba3d3e695f"} Mar 08 03:38:06.259507 master-0 kubenswrapper[13046]: I0308 03:38:06.252474 13046 generic.go:334] "Generic (PLEG): container finished" podID="3172792f-8e97-4624-a81c-c165f8068600" containerID="7341ca39f54fd259cfc0f07022cf8924a4503d51d569ec75af422f6a149ce7e7" exitCode=0 Mar 08 03:38:06.259507 master-0 kubenswrapper[13046]: I0308 03:38:06.252531 13046 generic.go:334] "Generic (PLEG): container finished" podID="3172792f-8e97-4624-a81c-c165f8068600" containerID="fc5e842357b2bf02e0e0875ca48bf772ff7886a805b08c2b7fd07775f88881b6" exitCode=143 Mar 08 03:38:06.259507 master-0 kubenswrapper[13046]: I0308 03:38:06.252553 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3172792f-8e97-4624-a81c-c165f8068600","Type":"ContainerDied","Data":"7341ca39f54fd259cfc0f07022cf8924a4503d51d569ec75af422f6a149ce7e7"} Mar 08 03:38:06.259507 master-0 kubenswrapper[13046]: I0308 03:38:06.253211 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3172792f-8e97-4624-a81c-c165f8068600","Type":"ContainerDied","Data":"fc5e842357b2bf02e0e0875ca48bf772ff7886a805b08c2b7fd07775f88881b6"} Mar 08 03:38:06.388553 master-0 kubenswrapper[13046]: I0308 03:38:06.371206 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:06.455107 master-0 kubenswrapper[13046]: I0308 03:38:06.454477 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-logs\") pod \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " Mar 08 03:38:06.455107 master-0 kubenswrapper[13046]: I0308 03:38:06.454761 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " Mar 08 03:38:06.455107 master-0 kubenswrapper[13046]: I0308 03:38:06.454843 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-scripts\") pod \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " Mar 08 03:38:06.455107 master-0 kubenswrapper[13046]: I0308 03:38:06.454918 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-httpd-run\") pod \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " Mar 08 03:38:06.455107 master-0 kubenswrapper[13046]: I0308 03:38:06.454959 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-config-data\") pod \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " Mar 08 03:38:06.457622 master-0 kubenswrapper[13046]: I0308 03:38:06.455119 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-2kddb\" (UniqueName: \"kubernetes.io/projected/de2c515d-f93d-4089-b2fb-73b3b6f185a2-kube-api-access-2kddb\") pod \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " Mar 08 03:38:06.457622 master-0 kubenswrapper[13046]: I0308 03:38:06.455157 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-combined-ca-bundle\") pod \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\" (UID: \"de2c515d-f93d-4089-b2fb-73b3b6f185a2\") " Mar 08 03:38:06.457622 master-0 kubenswrapper[13046]: I0308 03:38:06.456802 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-logs" (OuterVolumeSpecName: "logs") pod "de2c515d-f93d-4089-b2fb-73b3b6f185a2" (UID: "de2c515d-f93d-4089-b2fb-73b3b6f185a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:38:06.457622 master-0 kubenswrapper[13046]: I0308 03:38:06.457001 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "de2c515d-f93d-4089-b2fb-73b3b6f185a2" (UID: "de2c515d-f93d-4089-b2fb-73b3b6f185a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:38:06.462536 master-0 kubenswrapper[13046]: I0308 03:38:06.462334 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-scripts" (OuterVolumeSpecName: "scripts") pod "de2c515d-f93d-4089-b2fb-73b3b6f185a2" (UID: "de2c515d-f93d-4089-b2fb-73b3b6f185a2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:06.462536 master-0 kubenswrapper[13046]: I0308 03:38:06.462469 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2c515d-f93d-4089-b2fb-73b3b6f185a2-kube-api-access-2kddb" (OuterVolumeSpecName: "kube-api-access-2kddb") pod "de2c515d-f93d-4089-b2fb-73b3b6f185a2" (UID: "de2c515d-f93d-4089-b2fb-73b3b6f185a2"). InnerVolumeSpecName "kube-api-access-2kddb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:06.489016 master-0 kubenswrapper[13046]: I0308 03:38:06.488014 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de2c515d-f93d-4089-b2fb-73b3b6f185a2" (UID: "de2c515d-f93d-4089-b2fb-73b3b6f185a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:06.493176 master-0 kubenswrapper[13046]: I0308 03:38:06.493133 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0" (OuterVolumeSpecName: "glance") pod "de2c515d-f93d-4089-b2fb-73b3b6f185a2" (UID: "de2c515d-f93d-4089-b2fb-73b3b6f185a2"). InnerVolumeSpecName "pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 03:38:06.517229 master-0 kubenswrapper[13046]: I0308 03:38:06.517173 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-config-data" (OuterVolumeSpecName: "config-data") pod "de2c515d-f93d-4089-b2fb-73b3b6f185a2" (UID: "de2c515d-f93d-4089-b2fb-73b3b6f185a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:06.564406 master-0 kubenswrapper[13046]: I0308 03:38:06.558698 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.564406 master-0 kubenswrapper[13046]: I0308 03:38:06.558742 13046 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.564406 master-0 kubenswrapper[13046]: I0308 03:38:06.558755 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.564406 master-0 kubenswrapper[13046]: I0308 03:38:06.558765 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kddb\" (UniqueName: \"kubernetes.io/projected/de2c515d-f93d-4089-b2fb-73b3b6f185a2-kube-api-access-2kddb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.564406 master-0 kubenswrapper[13046]: I0308 03:38:06.558776 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2c515d-f93d-4089-b2fb-73b3b6f185a2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.564406 master-0 kubenswrapper[13046]: I0308 03:38:06.558785 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de2c515d-f93d-4089-b2fb-73b3b6f185a2-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.564406 master-0 kubenswrapper[13046]: I0308 03:38:06.558811 13046 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") on node \"master-0\" " Mar 08 03:38:06.589036 master-0 kubenswrapper[13046]: I0308 03:38:06.588869 13046 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 03:38:06.589206 master-0 kubenswrapper[13046]: I0308 03:38:06.589060 13046 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c" (UniqueName: "kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0") on node "master-0" Mar 08 03:38:06.606416 master-0 kubenswrapper[13046]: I0308 03:38:06.606282 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7fl4v"] Mar 08 03:38:06.614157 master-0 kubenswrapper[13046]: W0308 03:38:06.614108 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4248bb04_5f13_4afc_9263_49f3c929cd50.slice/crio-580dc84a16c29d1e113f4318652d769ef2204ee3dec0ed3db4c444a2de7db2bb WatchSource:0}: Error finding container 580dc84a16c29d1e113f4318652d769ef2204ee3dec0ed3db4c444a2de7db2bb: Status 404 returned error can't find the container with id 580dc84a16c29d1e113f4318652d769ef2204ee3dec0ed3db4c444a2de7db2bb Mar 08 03:38:06.614945 master-0 kubenswrapper[13046]: I0308 03:38:06.614906 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:06.621191 master-0 kubenswrapper[13046]: W0308 03:38:06.619211 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcedf488b_a7e0_4a91_a6e8_d4cd25e33df6.slice/crio-4f24c522c1ac4c2d8ffd93074e9769602fc84091367ed80d558b9806c524647a WatchSource:0}: Error finding container 4f24c522c1ac4c2d8ffd93074e9769602fc84091367ed80d558b9806c524647a: Status 404 returned error can't find the container with id 4f24c522c1ac4c2d8ffd93074e9769602fc84091367ed80d558b9806c524647a Mar 08 03:38:06.641702 master-0 kubenswrapper[13046]: I0308 03:38:06.641603 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:38:06.645142 master-0 kubenswrapper[13046]: I0308 03:38:06.645066 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-j4bdp"] Mar 08 03:38:06.661796 master-0 kubenswrapper[13046]: I0308 03:38:06.661739 13046 reconciler_common.go:293] "Volume detached for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.762635 master-0 kubenswrapper[13046]: I0308 03:38:06.762583 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-config\") pod \"6312c2ac-32ce-4040-91dd-3c0193d10918\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " Mar 08 03:38:06.762635 master-0 kubenswrapper[13046]: I0308 03:38:06.762638 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-logs\") pod \"3172792f-8e97-4624-a81c-c165f8068600\" (UID: 
\"3172792f-8e97-4624-a81c-c165f8068600\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.762665 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-sb\") pod \"6312c2ac-32ce-4040-91dd-3c0193d10918\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.762746 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm6zz\" (UniqueName: \"kubernetes.io/projected/6312c2ac-32ce-4040-91dd-3c0193d10918-kube-api-access-xm6zz\") pod \"6312c2ac-32ce-4040-91dd-3c0193d10918\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.762789 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-scripts\") pod \"3172792f-8e97-4624-a81c-c165f8068600\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.762891 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-httpd-run\") pod \"3172792f-8e97-4624-a81c-c165f8068600\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.762970 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-swift-storage-0\") pod \"6312c2ac-32ce-4040-91dd-3c0193d10918\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.763096 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-config-data\") pod \"3172792f-8e97-4624-a81c-c165f8068600\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.763160 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq7tj\" (UniqueName: \"kubernetes.io/projected/3172792f-8e97-4624-a81c-c165f8068600-kube-api-access-bq7tj\") pod \"3172792f-8e97-4624-a81c-c165f8068600\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.763207 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-combined-ca-bundle\") pod \"3172792f-8e97-4624-a81c-c165f8068600\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.763228 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-svc\") pod \"6312c2ac-32ce-4040-91dd-3c0193d10918\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.763344 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"3172792f-8e97-4624-a81c-c165f8068600\" (UID: \"3172792f-8e97-4624-a81c-c165f8068600\") " Mar 08 03:38:06.763837 master-0 kubenswrapper[13046]: I0308 03:38:06.763390 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-nb\") pod 
\"6312c2ac-32ce-4040-91dd-3c0193d10918\" (UID: \"6312c2ac-32ce-4040-91dd-3c0193d10918\") " Mar 08 03:38:06.766721 master-0 kubenswrapper[13046]: I0308 03:38:06.766659 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-logs" (OuterVolumeSpecName: "logs") pod "3172792f-8e97-4624-a81c-c165f8068600" (UID: "3172792f-8e97-4624-a81c-c165f8068600"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:38:06.768785 master-0 kubenswrapper[13046]: I0308 03:38:06.768743 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3172792f-8e97-4624-a81c-c165f8068600" (UID: "3172792f-8e97-4624-a81c-c165f8068600"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:38:06.778138 master-0 kubenswrapper[13046]: I0308 03:38:06.778051 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3172792f-8e97-4624-a81c-c165f8068600-kube-api-access-bq7tj" (OuterVolumeSpecName: "kube-api-access-bq7tj") pod "3172792f-8e97-4624-a81c-c165f8068600" (UID: "3172792f-8e97-4624-a81c-c165f8068600"). InnerVolumeSpecName "kube-api-access-bq7tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:06.791704 master-0 kubenswrapper[13046]: I0308 03:38:06.791642 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-scripts" (OuterVolumeSpecName: "scripts") pod "3172792f-8e97-4624-a81c-c165f8068600" (UID: "3172792f-8e97-4624-a81c-c165f8068600"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:06.797228 master-0 kubenswrapper[13046]: I0308 03:38:06.797100 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6312c2ac-32ce-4040-91dd-3c0193d10918-kube-api-access-xm6zz" (OuterVolumeSpecName: "kube-api-access-xm6zz") pod "6312c2ac-32ce-4040-91dd-3c0193d10918" (UID: "6312c2ac-32ce-4040-91dd-3c0193d10918"). InnerVolumeSpecName "kube-api-access-xm6zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:06.807657 master-0 kubenswrapper[13046]: I0308 03:38:06.807113 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82" (OuterVolumeSpecName: "glance") pod "3172792f-8e97-4624-a81c-c165f8068600" (UID: "3172792f-8e97-4624-a81c-c165f8068600"). InnerVolumeSpecName "pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 03:38:06.848020 master-0 kubenswrapper[13046]: I0308 03:38:06.847969 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3172792f-8e97-4624-a81c-c165f8068600" (UID: "3172792f-8e97-4624-a81c-c165f8068600"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:06.857984 master-0 kubenswrapper[13046]: I0308 03:38:06.857941 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6312c2ac-32ce-4040-91dd-3c0193d10918" (UID: "6312c2ac-32ce-4040-91dd-3c0193d10918"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:06.859074 master-0 kubenswrapper[13046]: I0308 03:38:06.859025 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6312c2ac-32ce-4040-91dd-3c0193d10918" (UID: "6312c2ac-32ce-4040-91dd-3c0193d10918"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:06.859471 master-0 kubenswrapper[13046]: I0308 03:38:06.859436 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-config-data" (OuterVolumeSpecName: "config-data") pod "3172792f-8e97-4624-a81c-c165f8068600" (UID: "3172792f-8e97-4624-a81c-c165f8068600"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:06.863043 master-0 kubenswrapper[13046]: I0308 03:38:06.862937 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6312c2ac-32ce-4040-91dd-3c0193d10918" (UID: "6312c2ac-32ce-4040-91dd-3c0193d10918"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:06.864585 master-0 kubenswrapper[13046]: I0308 03:38:06.864534 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-config" (OuterVolumeSpecName: "config") pod "6312c2ac-32ce-4040-91dd-3c0193d10918" (UID: "6312c2ac-32ce-4040-91dd-3c0193d10918"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:06.865873 master-0 kubenswrapper[13046]: I0308 03:38:06.865832 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.865873 master-0 kubenswrapper[13046]: I0308 03:38:06.865852 13046 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.865873 master-0 kubenswrapper[13046]: I0308 03:38:06.865862 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.865873 master-0 kubenswrapper[13046]: I0308 03:38:06.865873 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865883 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq7tj\" (UniqueName: \"kubernetes.io/projected/3172792f-8e97-4624-a81c-c165f8068600-kube-api-access-bq7tj\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865907 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3172792f-8e97-4624-a81c-c165f8068600-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865935 13046 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") on node \"master-0\" " Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865944 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865954 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865962 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3172792f-8e97-4624-a81c-c165f8068600-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865970 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.866082 master-0 kubenswrapper[13046]: I0308 03:38:06.865978 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm6zz\" (UniqueName: \"kubernetes.io/projected/6312c2ac-32ce-4040-91dd-3c0193d10918-kube-api-access-xm6zz\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.881358 master-0 kubenswrapper[13046]: I0308 03:38:06.881310 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6312c2ac-32ce-4040-91dd-3c0193d10918" (UID: "6312c2ac-32ce-4040-91dd-3c0193d10918"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:06.892782 master-0 kubenswrapper[13046]: I0308 03:38:06.892195 13046 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 03:38:06.898782 master-0 kubenswrapper[13046]: I0308 03:38:06.898741 13046 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd" (UniqueName: "kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82") on node "master-0" Mar 08 03:38:06.968859 master-0 kubenswrapper[13046]: I0308 03:38:06.968817 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6312c2ac-32ce-4040-91dd-3c0193d10918-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:06.968859 master-0 kubenswrapper[13046]: I0308 03:38:06.968854 13046 reconciler_common.go:293] "Volume detached for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:07.267543 master-0 kubenswrapper[13046]: I0308 03:38:07.267424 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7fl4v" event={"ID":"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6","Type":"ContainerStarted","Data":"91498d4b392723401762fcc25df5e4008223bb59c3a5018aa4b5b71e88aba15a"} Mar 08 03:38:07.267543 master-0 kubenswrapper[13046]: I0308 03:38:07.267476 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7fl4v" event={"ID":"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6","Type":"ContainerStarted","Data":"4f24c522c1ac4c2d8ffd93074e9769602fc84091367ed80d558b9806c524647a"} Mar 08 03:38:07.271968 master-0 kubenswrapper[13046]: I0308 03:38:07.271923 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" 
event={"ID":"3172792f-8e97-4624-a81c-c165f8068600","Type":"ContainerDied","Data":"854cba4180d2afe3d0770e4934208ffcfe57fcb19ff9390fd25279aadcd1523b"} Mar 08 03:38:07.271968 master-0 kubenswrapper[13046]: I0308 03:38:07.271952 13046 scope.go:117] "RemoveContainer" containerID="7341ca39f54fd259cfc0f07022cf8924a4503d51d569ec75af422f6a149ce7e7" Mar 08 03:38:07.272158 master-0 kubenswrapper[13046]: I0308 03:38:07.272120 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.274298 master-0 kubenswrapper[13046]: I0308 03:38:07.274258 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"de2c515d-f93d-4089-b2fb-73b3b6f185a2","Type":"ContainerDied","Data":"a858379d964d6a8d4447050f270f02e72632c117d2b426eb64a2497cf21a8c5c"} Mar 08 03:38:07.274651 master-0 kubenswrapper[13046]: I0308 03:38:07.274297 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.282861 master-0 kubenswrapper[13046]: I0308 03:38:07.282782 13046 generic.go:334] "Generic (PLEG): container finished" podID="2f71b2d7-115f-473a-9427-8af24a1a7467" containerID="5f2feb18084b6b8394640bdd39cec82ea0afd0d4e5afeea24398dbf8c76cceb1" exitCode=0 Mar 08 03:38:07.283053 master-0 kubenswrapper[13046]: I0308 03:38:07.282932 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dzj84" event={"ID":"2f71b2d7-115f-473a-9427-8af24a1a7467","Type":"ContainerDied","Data":"5f2feb18084b6b8394640bdd39cec82ea0afd0d4e5afeea24398dbf8c76cceb1"} Mar 08 03:38:07.291262 master-0 kubenswrapper[13046]: I0308 03:38:07.289752 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7fl4v" podStartSLOduration=3.289724691 podStartE2EDuration="3.289724691s" podCreationTimestamp="2026-03-08 03:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:07.283744781 +0000 UTC m=+1489.362511988" watchObservedRunningTime="2026-03-08 03:38:07.289724691 +0000 UTC m=+1489.368491908" Mar 08 03:38:07.303714 master-0 kubenswrapper[13046]: I0308 03:38:07.303674 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" event={"ID":"6312c2ac-32ce-4040-91dd-3c0193d10918","Type":"ContainerDied","Data":"f4e4f0f73b310e3496256add05e6ec7e9b49c0db7aa120fb27ae61785008faea"} Mar 08 03:38:07.303928 master-0 kubenswrapper[13046]: I0308 03:38:07.303880 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8f6c88c7-62p4t" Mar 08 03:38:07.312747 master-0 kubenswrapper[13046]: I0308 03:38:07.312686 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j4bdp" event={"ID":"4248bb04-5f13-4afc-9263-49f3c929cd50","Type":"ContainerStarted","Data":"580dc84a16c29d1e113f4318652d769ef2204ee3dec0ed3db4c444a2de7db2bb"} Mar 08 03:38:07.333014 master-0 kubenswrapper[13046]: I0308 03:38:07.332978 13046 scope.go:117] "RemoveContainer" containerID="fc5e842357b2bf02e0e0875ca48bf772ff7886a805b08c2b7fd07775f88881b6" Mar 08 03:38:07.366646 master-0 kubenswrapper[13046]: I0308 03:38:07.365954 13046 scope.go:117] "RemoveContainer" containerID="a35e64bc4f23c62a2d0c0072d1e833d761f72f161494dae611f8f6a5bcc32b25" Mar 08 03:38:07.367864 master-0 kubenswrapper[13046]: I0308 03:38:07.366764 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8f6c88c7-62p4t"] Mar 08 03:38:07.384659 master-0 kubenswrapper[13046]: I0308 03:38:07.384599 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8f6c88c7-62p4t"] Mar 08 03:38:07.394359 master-0 kubenswrapper[13046]: I0308 03:38:07.394299 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:07.409424 master-0 kubenswrapper[13046]: I0308 03:38:07.406575 13046 scope.go:117] "RemoveContainer" containerID="c73e925b670a49236a550646f1834b2e0477a04dc5a0290f4b497795cd29cf3f" Mar 08 03:38:07.414679 master-0 kubenswrapper[13046]: I0308 03:38:07.414618 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:07.450553 master-0 kubenswrapper[13046]: I0308 03:38:07.447771 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.466824 13046 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: E0308 03:38:07.467585 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-log" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.467601 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-log" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: E0308 03:38:07.467612 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-httpd" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.467618 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-httpd" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: E0308 03:38:07.467629 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerName="init" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.467636 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerName="init" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: E0308 03:38:07.467682 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-httpd" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.467689 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-httpd" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: E0308 03:38:07.467708 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerName="dnsmasq-dns" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 
03:38:07.467714 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerName="dnsmasq-dns" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: E0308 03:38:07.467737 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-log" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.467743 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-log" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.468174 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-log" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.468206 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-log" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.468267 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" containerName="glance-httpd" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.468281 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6312c2ac-32ce-4040-91dd-3c0193d10918" containerName="dnsmasq-dns" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.468297 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="3172792f-8e97-4624-a81c-c165f8068600" containerName="glance-httpd" Mar 08 03:38:07.474131 master-0 kubenswrapper[13046]: I0308 03:38:07.470190 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.475672 master-0 kubenswrapper[13046]: I0308 03:38:07.475626 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 03:38:07.478841 master-0 kubenswrapper[13046]: I0308 03:38:07.478801 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 03:38:07.480651 master-0 kubenswrapper[13046]: I0308 03:38:07.479176 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-default-external-config-data" Mar 08 03:38:07.519601 master-0 kubenswrapper[13046]: I0308 03:38:07.498660 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:38:07.519601 master-0 kubenswrapper[13046]: I0308 03:38:07.508843 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:07.519954 master-0 kubenswrapper[13046]: I0308 03:38:07.519606 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:38:07.522317 master-0 kubenswrapper[13046]: I0308 03:38:07.521764 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.530091 master-0 kubenswrapper[13046]: I0308 03:38:07.527046 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 03:38:07.530091 master-0 kubenswrapper[13046]: I0308 03:38:07.527084 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-default-internal-config-data" Mar 08 03:38:07.539302 master-0 kubenswrapper[13046]: I0308 03:38:07.537312 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:38:07.581515 master-0 kubenswrapper[13046]: I0308 03:38:07.581086 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.581515 master-0 kubenswrapper[13046]: I0308 03:38:07.581170 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.581515 master-0 kubenswrapper[13046]: I0308 03:38:07.581266 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.581515 master-0 
kubenswrapper[13046]: I0308 03:38:07.581306 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.581515 master-0 kubenswrapper[13046]: I0308 03:38:07.581347 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86c7\" (UniqueName: \"kubernetes.io/projected/3e537f07-7fd3-4505-8490-b028c741c650-kube-api-access-l86c7\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.581515 master-0 kubenswrapper[13046]: I0308 03:38:07.581374 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.581515 master-0 kubenswrapper[13046]: I0308 03:38:07.581417 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.581515 master-0 kubenswrapper[13046]: I0308 03:38:07.581460 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-scripts\") pod 
\"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683170 master-0 kubenswrapper[13046]: I0308 03:38:07.682846 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683170 master-0 kubenswrapper[13046]: I0308 03:38:07.683152 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683389 master-0 kubenswrapper[13046]: I0308 03:38:07.683185 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683389 master-0 kubenswrapper[13046]: I0308 03:38:07.683214 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683389 master-0 kubenswrapper[13046]: I0308 03:38:07.683254 13046 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683389 master-0 kubenswrapper[13046]: I0308 03:38:07.683275 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683389 master-0 kubenswrapper[13046]: I0308 03:38:07.683298 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683389 master-0 kubenswrapper[13046]: I0308 03:38:07.683327 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683389 master-0 kubenswrapper[13046]: I0308 03:38:07.683344 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft526\" (UniqueName: \"kubernetes.io/projected/ea8edfcc-d36d-445e-918b-38a71b2cafa4-kube-api-access-ft526\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683389 master-0 
kubenswrapper[13046]: I0308 03:38:07.683366 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683715 master-0 kubenswrapper[13046]: I0308 03:38:07.683406 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-internal-tls-certs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683715 master-0 kubenswrapper[13046]: I0308 03:38:07.683468 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683715 master-0 kubenswrapper[13046]: I0308 03:38:07.683527 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.683715 master-0 kubenswrapper[13046]: I0308 03:38:07.683565 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" 
(UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683715 master-0 kubenswrapper[13046]: I0308 03:38:07.683602 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l86c7\" (UniqueName: \"kubernetes.io/projected/3e537f07-7fd3-4505-8490-b028c741c650-kube-api-access-l86c7\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.683715 master-0 kubenswrapper[13046]: I0308 03:38:07.683620 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.684633 master-0 kubenswrapper[13046]: I0308 03:38:07.684202 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.684667 master-0 kubenswrapper[13046]: I0308 03:38:07.684630 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.688291 master-0 kubenswrapper[13046]: I0308 03:38:07.688226 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 03:38:07.688291 master-0 kubenswrapper[13046]: I0308 03:38:07.688254 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7ee56d67d41b92c054773f64bf7346771894ec4c4c18aa0117ea2564b4c6d4a8/globalmount\"" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.688811 master-0 kubenswrapper[13046]: I0308 03:38:07.688771 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.689142 master-0 kubenswrapper[13046]: I0308 03:38:07.689017 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.691291 master-0 kubenswrapper[13046]: I0308 03:38:07.691249 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.704516 master-0 kubenswrapper[13046]: I0308 03:38:07.704340 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.707113 master-0 kubenswrapper[13046]: I0308 03:38:07.706887 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86c7\" (UniqueName: \"kubernetes.io/projected/3e537f07-7fd3-4505-8490-b028c741c650-kube-api-access-l86c7\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:07.785247 master-0 kubenswrapper[13046]: I0308 03:38:07.785118 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785247 master-0 kubenswrapper[13046]: I0308 03:38:07.785184 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785247 master-0 kubenswrapper[13046]: I0308 03:38:07.785216 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785548 master-0 kubenswrapper[13046]: I0308 03:38:07.785259 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785548 master-0 kubenswrapper[13046]: I0308 03:38:07.785296 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft526\" (UniqueName: \"kubernetes.io/projected/ea8edfcc-d36d-445e-918b-38a71b2cafa4-kube-api-access-ft526\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785548 master-0 kubenswrapper[13046]: I0308 03:38:07.785317 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785548 master-0 kubenswrapper[13046]: I0308 03:38:07.785351 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-internal-tls-certs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785548 master-0 kubenswrapper[13046]: I0308 03:38:07.785439 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.785956 
master-0 kubenswrapper[13046]: I0308 03:38:07.785933 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.786628 master-0 kubenswrapper[13046]: I0308 03:38:07.786606 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.789978 master-0 kubenswrapper[13046]: I0308 03:38:07.789955 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.790851 master-0 kubenswrapper[13046]: I0308 03:38:07.790827 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 03:38:07.790951 master-0 kubenswrapper[13046]: I0308 03:38:07.790854 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/46958aa2ee719eeb05ed2819dd5b5bd381312e73d29f62f1a59eb590a4eaa799/globalmount\"" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.792246 master-0 kubenswrapper[13046]: I0308 03:38:07.792213 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.798894 master-0 kubenswrapper[13046]: I0308 03:38:07.798859 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-internal-tls-certs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.801680 master-0 kubenswrapper[13046]: I0308 03:38:07.801647 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:07.804971 master-0 kubenswrapper[13046]: I0308 03:38:07.804918 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft526\" (UniqueName: 
\"kubernetes.io/projected/ea8edfcc-d36d-445e-918b-38a71b2cafa4-kube-api-access-ft526\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:08.138582 master-0 kubenswrapper[13046]: I0308 03:38:08.137145 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3172792f-8e97-4624-a81c-c165f8068600" path="/var/lib/kubelet/pods/3172792f-8e97-4624-a81c-c165f8068600/volumes" Mar 08 03:38:08.140551 master-0 kubenswrapper[13046]: I0308 03:38:08.140253 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6312c2ac-32ce-4040-91dd-3c0193d10918" path="/var/lib/kubelet/pods/6312c2ac-32ce-4040-91dd-3c0193d10918/volumes" Mar 08 03:38:08.141029 master-0 kubenswrapper[13046]: I0308 03:38:08.141004 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2c515d-f93d-4089-b2fb-73b3b6f185a2" path="/var/lib/kubelet/pods/de2c515d-f93d-4089-b2fb-73b3b6f185a2/volumes" Mar 08 03:38:09.030089 master-0 kubenswrapper[13046]: I0308 03:38:09.030043 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:09.312942 master-0 kubenswrapper[13046]: I0308 03:38:09.312781 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:10.538145 master-0 kubenswrapper[13046]: I0308 03:38:10.538039 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:10.841670 master-0 kubenswrapper[13046]: I0308 03:38:10.841617 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:17.186077 master-0 kubenswrapper[13046]: I0308 03:38:17.185589 13046 scope.go:117] "RemoveContainer" containerID="c2ff03ed8f6c4c79eafb130fac7fbb943bb63acc7415c283d5142fba3d3e695f" Mar 08 03:38:17.438787 master-0 kubenswrapper[13046]: I0308 03:38:17.438659 13046 generic.go:334] "Generic (PLEG): container finished" podID="cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" containerID="91498d4b392723401762fcc25df5e4008223bb59c3a5018aa4b5b71e88aba15a" exitCode=0 Mar 08 03:38:17.438787 master-0 kubenswrapper[13046]: I0308 03:38:17.438738 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7fl4v" event={"ID":"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6","Type":"ContainerDied","Data":"91498d4b392723401762fcc25df5e4008223bb59c3a5018aa4b5b71e88aba15a"} Mar 08 03:38:18.719609 master-0 kubenswrapper[13046]: E0308 03:38:18.719512 13046 info.go:109] Failed to get network devices: open /sys/class/net/4f24c522c1ac4c2/address: no such file or directory Mar 08 03:38:19.462281 master-0 kubenswrapper[13046]: I0308 03:38:19.460899 13046 scope.go:117] "RemoveContainer" containerID="cf9b9b6be338f8d9f4e83cf459ebda8704f03aa9b7cba1862adc58ad1b5d94f3" Mar 08 03:38:19.493028 master-0 kubenswrapper[13046]: I0308 03:38:19.492885 13046 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-db-sync-dzj84" event={"ID":"2f71b2d7-115f-473a-9427-8af24a1a7467","Type":"ContainerDied","Data":"ba6576b1c78c76016bf5c2d6723276cd4aef7235fbe486054e12e0390c9b0b66"} Mar 08 03:38:19.493846 master-0 kubenswrapper[13046]: I0308 03:38:19.493819 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6576b1c78c76016bf5c2d6723276cd4aef7235fbe486054e12e0390c9b0b66" Mar 08 03:38:19.521778 master-0 kubenswrapper[13046]: I0308 03:38:19.521467 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dzj84" Mar 08 03:38:19.694882 master-0 kubenswrapper[13046]: I0308 03:38:19.694830 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvcpw\" (UniqueName: \"kubernetes.io/projected/2f71b2d7-115f-473a-9427-8af24a1a7467-kube-api-access-bvcpw\") pod \"2f71b2d7-115f-473a-9427-8af24a1a7467\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " Mar 08 03:38:19.695116 master-0 kubenswrapper[13046]: I0308 03:38:19.694918 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-config-data\") pod \"2f71b2d7-115f-473a-9427-8af24a1a7467\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " Mar 08 03:38:19.695116 master-0 kubenswrapper[13046]: I0308 03:38:19.694951 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-combined-ca-bundle\") pod \"2f71b2d7-115f-473a-9427-8af24a1a7467\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " Mar 08 03:38:19.695441 master-0 kubenswrapper[13046]: I0308 03:38:19.695054 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-scripts\") pod \"2f71b2d7-115f-473a-9427-8af24a1a7467\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " Mar 08 03:38:19.695633 master-0 kubenswrapper[13046]: I0308 03:38:19.695616 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f71b2d7-115f-473a-9427-8af24a1a7467-logs\") pod \"2f71b2d7-115f-473a-9427-8af24a1a7467\" (UID: \"2f71b2d7-115f-473a-9427-8af24a1a7467\") " Mar 08 03:38:19.696067 master-0 kubenswrapper[13046]: I0308 03:38:19.696025 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f71b2d7-115f-473a-9427-8af24a1a7467-logs" (OuterVolumeSpecName: "logs") pod "2f71b2d7-115f-473a-9427-8af24a1a7467" (UID: "2f71b2d7-115f-473a-9427-8af24a1a7467"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:38:19.697059 master-0 kubenswrapper[13046]: I0308 03:38:19.697032 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f71b2d7-115f-473a-9427-8af24a1a7467-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:19.698095 master-0 kubenswrapper[13046]: I0308 03:38:19.698061 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f71b2d7-115f-473a-9427-8af24a1a7467-kube-api-access-bvcpw" (OuterVolumeSpecName: "kube-api-access-bvcpw") pod "2f71b2d7-115f-473a-9427-8af24a1a7467" (UID: "2f71b2d7-115f-473a-9427-8af24a1a7467"). InnerVolumeSpecName "kube-api-access-bvcpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:19.699081 master-0 kubenswrapper[13046]: I0308 03:38:19.699037 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-scripts" (OuterVolumeSpecName: "scripts") pod "2f71b2d7-115f-473a-9427-8af24a1a7467" (UID: "2f71b2d7-115f-473a-9427-8af24a1a7467"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:19.724823 master-0 kubenswrapper[13046]: I0308 03:38:19.724775 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f71b2d7-115f-473a-9427-8af24a1a7467" (UID: "2f71b2d7-115f-473a-9427-8af24a1a7467"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:19.738363 master-0 kubenswrapper[13046]: I0308 03:38:19.738318 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-config-data" (OuterVolumeSpecName: "config-data") pod "2f71b2d7-115f-473a-9427-8af24a1a7467" (UID: "2f71b2d7-115f-473a-9427-8af24a1a7467"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:19.798923 master-0 kubenswrapper[13046]: I0308 03:38:19.798879 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvcpw\" (UniqueName: \"kubernetes.io/projected/2f71b2d7-115f-473a-9427-8af24a1a7467-kube-api-access-bvcpw\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:19.798923 master-0 kubenswrapper[13046]: I0308 03:38:19.798923 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:19.799250 master-0 kubenswrapper[13046]: I0308 03:38:19.798935 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:19.799250 master-0 kubenswrapper[13046]: I0308 03:38:19.798945 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f71b2d7-115f-473a-9427-8af24a1a7467-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:20.511746 master-0 kubenswrapper[13046]: I0308 03:38:20.511552 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dzj84" Mar 08 03:38:21.016088 master-0 kubenswrapper[13046]: I0308 03:38:21.016011 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7bff74b894-bgwhf"] Mar 08 03:38:21.019589 master-0 kubenswrapper[13046]: E0308 03:38:21.016564 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f71b2d7-115f-473a-9427-8af24a1a7467" containerName="placement-db-sync" Mar 08 03:38:21.019589 master-0 kubenswrapper[13046]: I0308 03:38:21.016580 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f71b2d7-115f-473a-9427-8af24a1a7467" containerName="placement-db-sync" Mar 08 03:38:21.019589 master-0 kubenswrapper[13046]: I0308 03:38:21.017664 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f71b2d7-115f-473a-9427-8af24a1a7467" containerName="placement-db-sync" Mar 08 03:38:21.019589 master-0 kubenswrapper[13046]: I0308 03:38:21.018790 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.022952 master-0 kubenswrapper[13046]: I0308 03:38:21.022898 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 08 03:38:21.023159 master-0 kubenswrapper[13046]: I0308 03:38:21.023095 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 08 03:38:21.023279 master-0 kubenswrapper[13046]: I0308 03:38:21.023251 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 03:38:21.023359 master-0 kubenswrapper[13046]: I0308 03:38:21.023255 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 03:38:21.066513 master-0 kubenswrapper[13046]: I0308 03:38:21.066433 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:21.126974 master-0 kubenswrapper[13046]: I0308 03:38:21.126319 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-config-data\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.126974 master-0 kubenswrapper[13046]: I0308 03:38:21.126468 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w9f4\" (UniqueName: \"kubernetes.io/projected/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-kube-api-access-8w9f4\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.126974 master-0 kubenswrapper[13046]: I0308 03:38:21.126516 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-scripts\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.126974 master-0 kubenswrapper[13046]: I0308 03:38:21.126566 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-public-tls-certs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.126974 master-0 kubenswrapper[13046]: I0308 03:38:21.126606 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-logs\") 
pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.126974 master-0 kubenswrapper[13046]: I0308 03:38:21.126630 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-combined-ca-bundle\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.126974 master-0 kubenswrapper[13046]: I0308 03:38:21.126653 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-internal-tls-certs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.138664 master-0 kubenswrapper[13046]: I0308 03:38:21.137884 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bff74b894-bgwhf"] Mar 08 03:38:21.228567 master-0 kubenswrapper[13046]: I0308 03:38:21.228516 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-config-data\") pod \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " Mar 08 03:38:21.228835 master-0 kubenswrapper[13046]: I0308 03:38:21.228803 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-credential-keys\") pod \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " Mar 08 03:38:21.228878 master-0 kubenswrapper[13046]: I0308 03:38:21.228843 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dvc8s\" (UniqueName: \"kubernetes.io/projected/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-kube-api-access-dvc8s\") pod \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " Mar 08 03:38:21.228914 master-0 kubenswrapper[13046]: I0308 03:38:21.228880 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-combined-ca-bundle\") pod \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " Mar 08 03:38:21.228914 master-0 kubenswrapper[13046]: I0308 03:38:21.228910 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-scripts\") pod \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " Mar 08 03:38:21.228979 master-0 kubenswrapper[13046]: I0308 03:38:21.228958 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-fernet-keys\") pod \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\" (UID: \"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6\") " Mar 08 03:38:21.229908 master-0 kubenswrapper[13046]: I0308 03:38:21.229884 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-logs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.230054 master-0 kubenswrapper[13046]: I0308 03:38:21.230034 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-combined-ca-bundle\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.230219 master-0 kubenswrapper[13046]: I0308 03:38:21.230200 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-internal-tls-certs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.230359 master-0 kubenswrapper[13046]: I0308 03:38:21.230294 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-config-data\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.230607 master-0 kubenswrapper[13046]: I0308 03:38:21.230535 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-logs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.230992 master-0 kubenswrapper[13046]: I0308 03:38:21.230965 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w9f4\" (UniqueName: \"kubernetes.io/projected/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-kube-api-access-8w9f4\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.231046 master-0 kubenswrapper[13046]: I0308 03:38:21.231028 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-scripts\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.231792 master-0 kubenswrapper[13046]: I0308 03:38:21.231738 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-public-tls-certs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.233904 master-0 kubenswrapper[13046]: I0308 03:38:21.233865 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-combined-ca-bundle\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.234637 master-0 kubenswrapper[13046]: I0308 03:38:21.234608 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" (UID: "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:21.235537 master-0 kubenswrapper[13046]: I0308 03:38:21.235500 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-scripts" (OuterVolumeSpecName: "scripts") pod "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" (UID: "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:21.235658 master-0 kubenswrapper[13046]: I0308 03:38:21.235621 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-kube-api-access-dvc8s" (OuterVolumeSpecName: "kube-api-access-dvc8s") pod "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" (UID: "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6"). InnerVolumeSpecName "kube-api-access-dvc8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:21.236178 master-0 kubenswrapper[13046]: I0308 03:38:21.235861 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-internal-tls-certs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.236178 master-0 kubenswrapper[13046]: I0308 03:38:21.236133 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-public-tls-certs\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.236506 master-0 kubenswrapper[13046]: I0308 03:38:21.236424 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" (UID: "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:21.237937 master-0 kubenswrapper[13046]: I0308 03:38:21.237904 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-config-data\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.241643 master-0 kubenswrapper[13046]: I0308 03:38:21.241583 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-scripts\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.259408 master-0 kubenswrapper[13046]: I0308 03:38:21.259350 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" (UID: "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:21.275421 master-0 kubenswrapper[13046]: I0308 03:38:21.275355 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-config-data" (OuterVolumeSpecName: "config-data") pod "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" (UID: "cedf488b-a7e0-4a91-a6e8-d4cd25e33df6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:21.334401 master-0 kubenswrapper[13046]: I0308 03:38:21.334276 13046 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:21.334401 master-0 kubenswrapper[13046]: I0308 03:38:21.334358 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:21.334401 master-0 kubenswrapper[13046]: I0308 03:38:21.334369 13046 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:21.334401 master-0 kubenswrapper[13046]: I0308 03:38:21.334379 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvc8s\" (UniqueName: \"kubernetes.io/projected/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-kube-api-access-dvc8s\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:21.334401 master-0 kubenswrapper[13046]: I0308 03:38:21.334389 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:21.334401 master-0 kubenswrapper[13046]: I0308 03:38:21.334397 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:21.539227 master-0 kubenswrapper[13046]: I0308 03:38:21.539160 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7fl4v" 
event={"ID":"cedf488b-a7e0-4a91-a6e8-d4cd25e33df6","Type":"ContainerDied","Data":"4f24c522c1ac4c2d8ffd93074e9769602fc84091367ed80d558b9806c524647a"} Mar 08 03:38:21.539227 master-0 kubenswrapper[13046]: I0308 03:38:21.539205 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f24c522c1ac4c2d8ffd93074e9769602fc84091367ed80d558b9806c524647a" Mar 08 03:38:21.539468 master-0 kubenswrapper[13046]: I0308 03:38:21.539228 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7fl4v" Mar 08 03:38:21.697260 master-0 kubenswrapper[13046]: I0308 03:38:21.697209 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w9f4\" (UniqueName: \"kubernetes.io/projected/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-kube-api-access-8w9f4\") pod \"placement-7bff74b894-bgwhf\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:21.982976 master-0 kubenswrapper[13046]: I0308 03:38:21.981361 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:22.303585 master-0 kubenswrapper[13046]: I0308 03:38:22.303522 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:38:22.590040 master-0 kubenswrapper[13046]: I0308 03:38:22.589978 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3e537f07-7fd3-4505-8490-b028c741c650","Type":"ContainerStarted","Data":"df8d8a8dac7d8a5e29c321f37c2004c3b928c131bbee03ea5ffc898a7587c970"} Mar 08 03:38:22.594293 master-0 kubenswrapper[13046]: I0308 03:38:22.593794 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-db-sync-xz5js" event={"ID":"888c100e-3bc9-45fa-a5a2-fe687ee09c1c","Type":"ContainerStarted","Data":"a300fa2a572f7a256930ec382dbb5fb8085e21791741be6fe4b0f8b9cabbf9a4"} Mar 08 03:38:22.605211 master-0 kubenswrapper[13046]: I0308 03:38:22.604528 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j4bdp" event={"ID":"4248bb04-5f13-4afc-9263-49f3c929cd50","Type":"ContainerStarted","Data":"c3fb1d9c0b95ed93f930b2974708ee53f1d91cee4541038371bf84c41a407db5"} Mar 08 03:38:22.626766 master-0 kubenswrapper[13046]: I0308 03:38:22.626600 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:38:22.634264 master-0 kubenswrapper[13046]: I0308 03:38:22.634208 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-db-sync-xz5js" podStartSLOduration=4.153308993 podStartE2EDuration="28.634176169s" podCreationTimestamp="2026-03-08 03:37:54 +0000 UTC" firstStartedPulling="2026-03-08 03:37:56.399373315 +0000 UTC m=+1478.478140552" lastFinishedPulling="2026-03-08 03:38:20.880240511 +0000 UTC m=+1502.959007728" observedRunningTime="2026-03-08 03:38:22.616150617 +0000 UTC m=+1504.694917854" 
watchObservedRunningTime="2026-03-08 03:38:22.634176169 +0000 UTC m=+1504.712943386" Mar 08 03:38:22.766824 master-0 kubenswrapper[13046]: W0308 03:38:22.766775 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a578f6f_9e06_42a0_b8c9_61f05d7a5c0c.slice/crio-0a43a375968f9a9908aa22ec6c9fbc2395fa319d73137232ba8081851785c5f1 WatchSource:0}: Error finding container 0a43a375968f9a9908aa22ec6c9fbc2395fa319d73137232ba8081851785c5f1: Status 404 returned error can't find the container with id 0a43a375968f9a9908aa22ec6c9fbc2395fa319d73137232ba8081851785c5f1 Mar 08 03:38:22.778651 master-0 kubenswrapper[13046]: I0308 03:38:22.778614 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bff74b894-bgwhf"] Mar 08 03:38:23.081469 master-0 kubenswrapper[13046]: I0308 03:38:23.080252 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db8dcf7d7-ct9xk"] Mar 08 03:38:23.081469 master-0 kubenswrapper[13046]: E0308 03:38:23.080749 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" containerName="keystone-bootstrap" Mar 08 03:38:23.081469 master-0 kubenswrapper[13046]: I0308 03:38:23.080764 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" containerName="keystone-bootstrap" Mar 08 03:38:23.081469 master-0 kubenswrapper[13046]: I0308 03:38:23.080940 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" containerName="keystone-bootstrap" Mar 08 03:38:23.090935 master-0 kubenswrapper[13046]: I0308 03:38:23.090378 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.102952 master-0 kubenswrapper[13046]: I0308 03:38:23.100394 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 03:38:23.102952 master-0 kubenswrapper[13046]: I0308 03:38:23.100737 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 03:38:23.102952 master-0 kubenswrapper[13046]: I0308 03:38:23.100884 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 03:38:23.102952 master-0 kubenswrapper[13046]: I0308 03:38:23.101022 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 08 03:38:23.104362 master-0 kubenswrapper[13046]: I0308 03:38:23.104336 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 08 03:38:23.159582 master-0 kubenswrapper[13046]: I0308 03:38:23.149657 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db8dcf7d7-ct9xk"] Mar 08 03:38:23.204311 master-0 kubenswrapper[13046]: I0308 03:38:23.204262 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-fernet-keys\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.204530 master-0 kubenswrapper[13046]: I0308 03:38:23.204336 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm8w6\" (UniqueName: \"kubernetes.io/projected/98bf3b85-b910-4088-a5b4-8aa77f503535-kube-api-access-hm8w6\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.204530 master-0 
kubenswrapper[13046]: I0308 03:38:23.204415 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-combined-ca-bundle\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.204530 master-0 kubenswrapper[13046]: I0308 03:38:23.204431 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-internal-tls-certs\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.204530 master-0 kubenswrapper[13046]: I0308 03:38:23.204469 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-config-data\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.204530 master-0 kubenswrapper[13046]: I0308 03:38:23.204502 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-scripts\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.204888 master-0 kubenswrapper[13046]: I0308 03:38:23.204864 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-public-tls-certs\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " 
pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.204947 master-0 kubenswrapper[13046]: I0308 03:38:23.204900 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-credential-keys\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.306230 master-0 kubenswrapper[13046]: I0308 03:38:23.306179 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-public-tls-certs\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.306230 master-0 kubenswrapper[13046]: I0308 03:38:23.306230 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-credential-keys\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.306275 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-fernet-keys\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.306303 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm8w6\" (UniqueName: \"kubernetes.io/projected/98bf3b85-b910-4088-a5b4-8aa77f503535-kube-api-access-hm8w6\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " 
pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.306368 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-combined-ca-bundle\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.306384 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-internal-tls-certs\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.306419 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-config-data\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.306440 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-scripts\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.309360 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-scripts\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 
master-0 kubenswrapper[13046]: I0308 03:38:23.310613 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-internal-tls-certs\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.311637 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-config-data\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.313198 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-combined-ca-bundle\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.315756 master-0 kubenswrapper[13046]: I0308 03:38:23.313666 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-fernet-keys\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.317132 master-0 kubenswrapper[13046]: I0308 03:38:23.317019 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-public-tls-certs\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.324458 master-0 kubenswrapper[13046]: I0308 03:38:23.321148 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98bf3b85-b910-4088-a5b4-8aa77f503535-credential-keys\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.324807 master-0 kubenswrapper[13046]: I0308 03:38:23.324783 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm8w6\" (UniqueName: \"kubernetes.io/projected/98bf3b85-b910-4088-a5b4-8aa77f503535-kube-api-access-hm8w6\") pod \"keystone-db8dcf7d7-ct9xk\" (UID: \"98bf3b85-b910-4088-a5b4-8aa77f503535\") " pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.513603 master-0 kubenswrapper[13046]: I0308 03:38:23.507037 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:23.630600 master-0 kubenswrapper[13046]: I0308 03:38:23.630423 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bff74b894-bgwhf" event={"ID":"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c","Type":"ContainerStarted","Data":"8daaacbc9e146992705f48a1491ae2830c1e0864097b1f9b8700acd2cd0f8061"} Mar 08 03:38:23.635609 master-0 kubenswrapper[13046]: I0308 03:38:23.635541 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bff74b894-bgwhf" event={"ID":"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c","Type":"ContainerStarted","Data":"0a43a375968f9a9908aa22ec6c9fbc2395fa319d73137232ba8081851785c5f1"} Mar 08 03:38:23.637679 master-0 kubenswrapper[13046]: I0308 03:38:23.637650 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"ea8edfcc-d36d-445e-918b-38a71b2cafa4","Type":"ContainerStarted","Data":"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb"} Mar 08 03:38:23.637752 master-0 kubenswrapper[13046]: I0308 03:38:23.637685 13046 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"ea8edfcc-d36d-445e-918b-38a71b2cafa4","Type":"ContainerStarted","Data":"abbbe860644150bd69db3be6ba3f97c1a9ff4d0f279fe5b9b62e56433650fc51"} Mar 08 03:38:23.638866 master-0 kubenswrapper[13046]: I0308 03:38:23.638840 13046 generic.go:334] "Generic (PLEG): container finished" podID="4248bb04-5f13-4afc-9263-49f3c929cd50" containerID="c3fb1d9c0b95ed93f930b2974708ee53f1d91cee4541038371bf84c41a407db5" exitCode=0 Mar 08 03:38:23.638934 master-0 kubenswrapper[13046]: I0308 03:38:23.638893 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j4bdp" event={"ID":"4248bb04-5f13-4afc-9263-49f3c929cd50","Type":"ContainerDied","Data":"c3fb1d9c0b95ed93f930b2974708ee53f1d91cee4541038371bf84c41a407db5"} Mar 08 03:38:23.650192 master-0 kubenswrapper[13046]: I0308 03:38:23.650135 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3e537f07-7fd3-4505-8490-b028c741c650","Type":"ContainerStarted","Data":"adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d"} Mar 08 03:38:24.040308 master-0 kubenswrapper[13046]: I0308 03:38:24.033028 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db8dcf7d7-ct9xk"] Mar 08 03:38:24.664464 master-0 kubenswrapper[13046]: I0308 03:38:24.664406 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3e537f07-7fd3-4505-8490-b028c741c650","Type":"ContainerStarted","Data":"2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab"} Mar 08 03:38:24.666539 master-0 kubenswrapper[13046]: I0308 03:38:24.666515 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bff74b894-bgwhf" event={"ID":"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c","Type":"ContainerStarted","Data":"cc663a949dd9f36d6267702390a8d2a544cb5dcfd0ecc132d92141a48d2c6668"} Mar 08 
03:38:24.666764 master-0 kubenswrapper[13046]: I0308 03:38:24.666726 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:24.666834 master-0 kubenswrapper[13046]: I0308 03:38:24.666772 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:24.669888 master-0 kubenswrapper[13046]: I0308 03:38:24.669442 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"ea8edfcc-d36d-445e-918b-38a71b2cafa4","Type":"ContainerStarted","Data":"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"} Mar 08 03:38:24.672945 master-0 kubenswrapper[13046]: I0308 03:38:24.672895 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j4bdp" event={"ID":"4248bb04-5f13-4afc-9263-49f3c929cd50","Type":"ContainerStarted","Data":"a07e2e878060a43c1b69f035fd7659c0fcbd40f297fb45c9fe13e1436bf2d989"} Mar 08 03:38:24.690480 master-0 kubenswrapper[13046]: I0308 03:38:24.690381 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db8dcf7d7-ct9xk" event={"ID":"98bf3b85-b910-4088-a5b4-8aa77f503535","Type":"ContainerStarted","Data":"6c5973b67cbd8a26f546b0683373d3a9d74b19d409b5d4c19dbd70bafe0ce09a"} Mar 08 03:38:24.690480 master-0 kubenswrapper[13046]: I0308 03:38:24.690472 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db8dcf7d7-ct9xk" event={"ID":"98bf3b85-b910-4088-a5b4-8aa77f503535","Type":"ContainerStarted","Data":"0a2d512b198ea9d871b00cee4813208b589413bbdef11aa351e825c855444e1e"} Mar 08 03:38:24.690980 master-0 kubenswrapper[13046]: I0308 03:38:24.690942 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:24.772231 master-0 kubenswrapper[13046]: I0308 03:38:24.772147 13046 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/glance-bf784-default-external-api-0" podStartSLOduration=17.772122368 podStartE2EDuration="17.772122368s" podCreationTimestamp="2026-03-08 03:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:24.751473061 +0000 UTC m=+1506.830240278" watchObservedRunningTime="2026-03-08 03:38:24.772122368 +0000 UTC m=+1506.850889585" Mar 08 03:38:24.861844 master-0 kubenswrapper[13046]: I0308 03:38:24.861742 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-j4bdp" podStartSLOduration=4.206497254 podStartE2EDuration="19.861715394s" podCreationTimestamp="2026-03-08 03:38:05 +0000 UTC" firstStartedPulling="2026-03-08 03:38:06.61929651 +0000 UTC m=+1488.698063727" lastFinishedPulling="2026-03-08 03:38:22.27451465 +0000 UTC m=+1504.353281867" observedRunningTime="2026-03-08 03:38:24.841153609 +0000 UTC m=+1506.919920866" watchObservedRunningTime="2026-03-08 03:38:24.861715394 +0000 UTC m=+1506.940482651" Mar 08 03:38:24.963603 master-0 kubenswrapper[13046]: I0308 03:38:24.963436 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7bff74b894-bgwhf" podStartSLOduration=4.963415894 podStartE2EDuration="4.963415894s" podCreationTimestamp="2026-03-08 03:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:24.941317196 +0000 UTC m=+1507.020084423" watchObservedRunningTime="2026-03-08 03:38:24.963415894 +0000 UTC m=+1507.042183111" Mar 08 03:38:25.194704 master-0 kubenswrapper[13046]: I0308 03:38:25.194613 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bf784-default-internal-api-0" podStartSLOduration=18.194585022 podStartE2EDuration="18.194585022s" podCreationTimestamp="2026-03-08 03:38:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:25.167586725 +0000 UTC m=+1507.246353952" watchObservedRunningTime="2026-03-08 03:38:25.194585022 +0000 UTC m=+1507.273352249" Mar 08 03:38:25.243208 master-0 kubenswrapper[13046]: I0308 03:38:25.243014 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db8dcf7d7-ct9xk" podStartSLOduration=3.242993638 podStartE2EDuration="3.242993638s" podCreationTimestamp="2026-03-08 03:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:25.238070218 +0000 UTC m=+1507.316837435" watchObservedRunningTime="2026-03-08 03:38:25.242993638 +0000 UTC m=+1507.321760855" Mar 08 03:38:25.469070 master-0 kubenswrapper[13046]: I0308 03:38:25.468993 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5b6ffc768d-hm56p"] Mar 08 03:38:25.470755 master-0 kubenswrapper[13046]: I0308 03:38:25.470731 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.493058 master-0 kubenswrapper[13046]: I0308 03:38:25.492999 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b6ffc768d-hm56p"] Mar 08 03:38:25.592725 master-0 kubenswrapper[13046]: I0308 03:38:25.592660 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hnn9\" (UniqueName: \"kubernetes.io/projected/26d64446-1c0e-488a-b489-a05dfe5ad9a6-kube-api-access-8hnn9\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.592934 master-0 kubenswrapper[13046]: I0308 03:38:25.592759 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d64446-1c0e-488a-b489-a05dfe5ad9a6-logs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.592934 master-0 kubenswrapper[13046]: I0308 03:38:25.592787 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-internal-tls-certs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.593129 master-0 kubenswrapper[13046]: I0308 03:38:25.593080 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-scripts\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.593202 master-0 kubenswrapper[13046]: I0308 03:38:25.593181 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-public-tls-certs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.593399 master-0 kubenswrapper[13046]: I0308 03:38:25.593379 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-combined-ca-bundle\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.593640 master-0 kubenswrapper[13046]: I0308 03:38:25.593616 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-config-data\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.695830 master-0 kubenswrapper[13046]: I0308 03:38:25.695750 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-combined-ca-bundle\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.696253 master-0 kubenswrapper[13046]: I0308 03:38:25.695957 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-config-data\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.696253 master-0 
kubenswrapper[13046]: I0308 03:38:25.696021 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hnn9\" (UniqueName: \"kubernetes.io/projected/26d64446-1c0e-488a-b489-a05dfe5ad9a6-kube-api-access-8hnn9\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.696253 master-0 kubenswrapper[13046]: I0308 03:38:25.696072 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d64446-1c0e-488a-b489-a05dfe5ad9a6-logs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.696253 master-0 kubenswrapper[13046]: I0308 03:38:25.696099 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-internal-tls-certs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.696719 master-0 kubenswrapper[13046]: I0308 03:38:25.696550 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-scripts\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.696719 master-0 kubenswrapper[13046]: I0308 03:38:25.696655 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-public-tls-certs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.698004 master-0 kubenswrapper[13046]: 
I0308 03:38:25.697954 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26d64446-1c0e-488a-b489-a05dfe5ad9a6-logs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.702138 master-0 kubenswrapper[13046]: I0308 03:38:25.702102 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-internal-tls-certs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.702252 master-0 kubenswrapper[13046]: I0308 03:38:25.702181 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-config-data\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.704017 master-0 kubenswrapper[13046]: I0308 03:38:25.703978 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-public-tls-certs\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.705391 master-0 kubenswrapper[13046]: I0308 03:38:25.704781 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-scripts\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.706058 master-0 kubenswrapper[13046]: I0308 03:38:25.706023 13046 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d64446-1c0e-488a-b489-a05dfe5ad9a6-combined-ca-bundle\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.707253 master-0 kubenswrapper[13046]: I0308 03:38:25.707219 13046 generic.go:334] "Generic (PLEG): container finished" podID="9fa4b3ae-ec1e-4819-a483-12a563171db2" containerID="4d71bda5b031b2d9a24e5b6a59a0fc20622fc76fecb76d2cacb2685b9dfd5b5b" exitCode=0 Mar 08 03:38:25.707366 master-0 kubenswrapper[13046]: I0308 03:38:25.707335 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k4mpt" event={"ID":"9fa4b3ae-ec1e-4819-a483-12a563171db2","Type":"ContainerDied","Data":"4d71bda5b031b2d9a24e5b6a59a0fc20622fc76fecb76d2cacb2685b9dfd5b5b"} Mar 08 03:38:25.717585 master-0 kubenswrapper[13046]: I0308 03:38:25.717452 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hnn9\" (UniqueName: \"kubernetes.io/projected/26d64446-1c0e-488a-b489-a05dfe5ad9a6-kube-api-access-8hnn9\") pod \"placement-5b6ffc768d-hm56p\" (UID: \"26d64446-1c0e-488a-b489-a05dfe5ad9a6\") " pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:25.787746 master-0 kubenswrapper[13046]: I0308 03:38:25.787619 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:26.345282 master-0 kubenswrapper[13046]: W0308 03:38:26.345230 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d64446_1c0e_488a_b489_a05dfe5ad9a6.slice/crio-3b9846e7bb724c6f1aa485306326e3d8ea18c5112f7dd5d1ce122cae1dd2bf93 WatchSource:0}: Error finding container 3b9846e7bb724c6f1aa485306326e3d8ea18c5112f7dd5d1ce122cae1dd2bf93: Status 404 returned error can't find the container with id 3b9846e7bb724c6f1aa485306326e3d8ea18c5112f7dd5d1ce122cae1dd2bf93 Mar 08 03:38:26.362431 master-0 kubenswrapper[13046]: I0308 03:38:26.362354 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b6ffc768d-hm56p"] Mar 08 03:38:26.726025 master-0 kubenswrapper[13046]: I0308 03:38:26.725927 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b6ffc768d-hm56p" event={"ID":"26d64446-1c0e-488a-b489-a05dfe5ad9a6","Type":"ContainerStarted","Data":"5f085505a005d42115c739f884d1624099674417360cf6902127b2c8bd810ee9"} Mar 08 03:38:26.726025 master-0 kubenswrapper[13046]: I0308 03:38:26.726021 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b6ffc768d-hm56p" event={"ID":"26d64446-1c0e-488a-b489-a05dfe5ad9a6","Type":"ContainerStarted","Data":"3b9846e7bb724c6f1aa485306326e3d8ea18c5112f7dd5d1ce122cae1dd2bf93"} Mar 08 03:38:27.222860 master-0 kubenswrapper[13046]: I0308 03:38:27.222807 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:38:27.352034 master-0 kubenswrapper[13046]: I0308 03:38:27.351929 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-config\") pod \"9fa4b3ae-ec1e-4819-a483-12a563171db2\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " Mar 08 03:38:27.352287 master-0 kubenswrapper[13046]: I0308 03:38:27.352103 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-combined-ca-bundle\") pod \"9fa4b3ae-ec1e-4819-a483-12a563171db2\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " Mar 08 03:38:27.352287 master-0 kubenswrapper[13046]: I0308 03:38:27.352161 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnr6b\" (UniqueName: \"kubernetes.io/projected/9fa4b3ae-ec1e-4819-a483-12a563171db2-kube-api-access-nnr6b\") pod \"9fa4b3ae-ec1e-4819-a483-12a563171db2\" (UID: \"9fa4b3ae-ec1e-4819-a483-12a563171db2\") " Mar 08 03:38:27.365008 master-0 kubenswrapper[13046]: I0308 03:38:27.364917 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa4b3ae-ec1e-4819-a483-12a563171db2-kube-api-access-nnr6b" (OuterVolumeSpecName: "kube-api-access-nnr6b") pod "9fa4b3ae-ec1e-4819-a483-12a563171db2" (UID: "9fa4b3ae-ec1e-4819-a483-12a563171db2"). InnerVolumeSpecName "kube-api-access-nnr6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:27.396173 master-0 kubenswrapper[13046]: I0308 03:38:27.396026 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-config" (OuterVolumeSpecName: "config") pod "9fa4b3ae-ec1e-4819-a483-12a563171db2" (UID: "9fa4b3ae-ec1e-4819-a483-12a563171db2"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:27.398529 master-0 kubenswrapper[13046]: I0308 03:38:27.398441 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fa4b3ae-ec1e-4819-a483-12a563171db2" (UID: "9fa4b3ae-ec1e-4819-a483-12a563171db2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:27.486548 master-0 kubenswrapper[13046]: I0308 03:38:27.486473 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:27.486816 master-0 kubenswrapper[13046]: I0308 03:38:27.486795 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa4b3ae-ec1e-4819-a483-12a563171db2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:27.486946 master-0 kubenswrapper[13046]: I0308 03:38:27.486930 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnr6b\" (UniqueName: \"kubernetes.io/projected/9fa4b3ae-ec1e-4819-a483-12a563171db2-kube-api-access-nnr6b\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:27.743028 master-0 kubenswrapper[13046]: I0308 03:38:27.742133 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b6ffc768d-hm56p" event={"ID":"26d64446-1c0e-488a-b489-a05dfe5ad9a6","Type":"ContainerStarted","Data":"0eacb5624673e747b4fafd85c3658b3959dc6ef272daf752fe868784aa0a7bbf"} Mar 08 03:38:27.743760 master-0 kubenswrapper[13046]: I0308 03:38:27.743718 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:27.743760 master-0 kubenswrapper[13046]: I0308 03:38:27.743755 
13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:27.746570 master-0 kubenswrapper[13046]: I0308 03:38:27.746523 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k4mpt" event={"ID":"9fa4b3ae-ec1e-4819-a483-12a563171db2","Type":"ContainerDied","Data":"92492e587a4284a9aea4e12634213fc0832fdd3922361c4957c2868169c96b08"} Mar 08 03:38:27.746656 master-0 kubenswrapper[13046]: I0308 03:38:27.746573 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92492e587a4284a9aea4e12634213fc0832fdd3922361c4957c2868169c96b08" Mar 08 03:38:27.746718 master-0 kubenswrapper[13046]: I0308 03:38:27.746666 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k4mpt" Mar 08 03:38:27.793158 master-0 kubenswrapper[13046]: I0308 03:38:27.793047 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5b6ffc768d-hm56p" podStartSLOduration=2.793018176 podStartE2EDuration="2.793018176s" podCreationTimestamp="2026-03-08 03:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:27.784089802 +0000 UTC m=+1509.862857019" watchObservedRunningTime="2026-03-08 03:38:27.793018176 +0000 UTC m=+1509.871785423" Mar 08 03:38:28.020658 master-0 kubenswrapper[13046]: I0308 03:38:28.016701 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9f86777f9-xlvvs"] Mar 08 03:38:28.020658 master-0 kubenswrapper[13046]: E0308 03:38:28.020578 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa4b3ae-ec1e-4819-a483-12a563171db2" containerName="neutron-db-sync" Mar 08 03:38:28.020658 master-0 kubenswrapper[13046]: I0308 03:38:28.020601 13046 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9fa4b3ae-ec1e-4819-a483-12a563171db2" containerName="neutron-db-sync" Mar 08 03:38:28.029228 master-0 kubenswrapper[13046]: I0308 03:38:28.020957 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa4b3ae-ec1e-4819-a483-12a563171db2" containerName="neutron-db-sync" Mar 08 03:38:28.029228 master-0 kubenswrapper[13046]: I0308 03:38:28.027222 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.061104 master-0 kubenswrapper[13046]: I0308 03:38:28.060973 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f86777f9-xlvvs"] Mar 08 03:38:28.112918 master-0 kubenswrapper[13046]: I0308 03:38:28.112609 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67766cb894-9qmph"] Mar 08 03:38:28.114555 master-0 kubenswrapper[13046]: I0308 03:38:28.114274 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.126086 master-0 kubenswrapper[13046]: I0308 03:38:28.125966 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 03:38:28.127670 master-0 kubenswrapper[13046]: I0308 03:38:28.126523 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 03:38:28.127670 master-0 kubenswrapper[13046]: I0308 03:38:28.126636 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 08 03:38:28.207124 master-0 kubenswrapper[13046]: I0308 03:38:28.207062 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-swift-storage-0\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 
08 03:38:28.207124 master-0 kubenswrapper[13046]: I0308 03:38:28.207126 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-config\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.207367 master-0 kubenswrapper[13046]: I0308 03:38:28.207190 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-svc\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.207367 master-0 kubenswrapper[13046]: I0308 03:38:28.207213 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-nb\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.207367 master-0 kubenswrapper[13046]: I0308 03:38:28.207239 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-combined-ca-bundle\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.207367 master-0 kubenswrapper[13046]: I0308 03:38:28.207289 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-config\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " 
pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.207709 master-0 kubenswrapper[13046]: I0308 03:38:28.207683 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6p8h\" (UniqueName: \"kubernetes.io/projected/f3134df2-c86a-46bb-89ca-3598293b4695-kube-api-access-g6p8h\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.207778 master-0 kubenswrapper[13046]: I0308 03:38:28.207764 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-httpd-config\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.207831 master-0 kubenswrapper[13046]: I0308 03:38:28.207819 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-sb\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.208211 master-0 kubenswrapper[13046]: I0308 03:38:28.208188 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-ovndb-tls-certs\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.208251 master-0 kubenswrapper[13046]: I0308 03:38:28.208227 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbmjc\" (UniqueName: 
\"kubernetes.io/projected/6f99839b-bfe0-4190-824a-67227752928b-kube-api-access-pbmjc\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.211626 master-0 kubenswrapper[13046]: I0308 03:38:28.211574 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67766cb894-9qmph"] Mar 08 03:38:28.309846 master-0 kubenswrapper[13046]: I0308 03:38:28.309721 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-sb\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.309846 master-0 kubenswrapper[13046]: I0308 03:38:28.309804 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-ovndb-tls-certs\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.309846 master-0 kubenswrapper[13046]: I0308 03:38:28.309839 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbmjc\" (UniqueName: \"kubernetes.io/projected/6f99839b-bfe0-4190-824a-67227752928b-kube-api-access-pbmjc\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.310077 master-0 kubenswrapper[13046]: I0308 03:38:28.309880 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-swift-storage-0\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " 
pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.310077 master-0 kubenswrapper[13046]: I0308 03:38:28.309903 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-config\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.310077 master-0 kubenswrapper[13046]: I0308 03:38:28.309936 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-svc\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.310077 master-0 kubenswrapper[13046]: I0308 03:38:28.309967 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-nb\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.310077 master-0 kubenswrapper[13046]: I0308 03:38:28.309989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-combined-ca-bundle\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.310077 master-0 kubenswrapper[13046]: I0308 03:38:28.310031 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-config\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 
03:38:28.310077 master-0 kubenswrapper[13046]: I0308 03:38:28.310048 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6p8h\" (UniqueName: \"kubernetes.io/projected/f3134df2-c86a-46bb-89ca-3598293b4695-kube-api-access-g6p8h\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.310308 master-0 kubenswrapper[13046]: I0308 03:38:28.310099 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-httpd-config\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.310789 master-0 kubenswrapper[13046]: I0308 03:38:28.310746 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-sb\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.311696 master-0 kubenswrapper[13046]: I0308 03:38:28.311333 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-swift-storage-0\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.311696 master-0 kubenswrapper[13046]: I0308 03:38:28.311365 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-config\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.312135 master-0 
kubenswrapper[13046]: I0308 03:38:28.312112 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-svc\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.312740 master-0 kubenswrapper[13046]: I0308 03:38:28.312695 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-nb\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.316430 master-0 kubenswrapper[13046]: I0308 03:38:28.315574 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-combined-ca-bundle\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.316430 master-0 kubenswrapper[13046]: I0308 03:38:28.316238 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-ovndb-tls-certs\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.316430 master-0 kubenswrapper[13046]: I0308 03:38:28.316305 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-config\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.316741 master-0 kubenswrapper[13046]: I0308 03:38:28.316708 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-httpd-config\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.331963 master-0 kubenswrapper[13046]: I0308 03:38:28.331401 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6p8h\" (UniqueName: \"kubernetes.io/projected/f3134df2-c86a-46bb-89ca-3598293b4695-kube-api-access-g6p8h\") pod \"neutron-67766cb894-9qmph\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.332433 master-0 kubenswrapper[13046]: I0308 03:38:28.332404 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbmjc\" (UniqueName: \"kubernetes.io/projected/6f99839b-bfe0-4190-824a-67227752928b-kube-api-access-pbmjc\") pod \"dnsmasq-dns-9f86777f9-xlvvs\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.397305 master-0 kubenswrapper[13046]: I0308 03:38:28.397266 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:28.531578 master-0 kubenswrapper[13046]: I0308 03:38:28.531293 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:28.960544 master-0 kubenswrapper[13046]: I0308 03:38:28.960165 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f86777f9-xlvvs"] Mar 08 03:38:29.150418 master-0 kubenswrapper[13046]: I0308 03:38:29.150358 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67766cb894-9qmph"] Mar 08 03:38:29.313990 master-0 kubenswrapper[13046]: I0308 03:38:29.313946 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:29.314142 master-0 kubenswrapper[13046]: I0308 03:38:29.314019 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:29.347684 master-0 kubenswrapper[13046]: I0308 03:38:29.347639 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:29.362814 master-0 kubenswrapper[13046]: I0308 03:38:29.362757 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:29.780321 master-0 kubenswrapper[13046]: I0308 03:38:29.780182 13046 generic.go:334] "Generic (PLEG): container finished" podID="6f99839b-bfe0-4190-824a-67227752928b" containerID="0ede9ed7177ba70869be5a31d4dca1a5a158e9fa655580e3c9d4316d4926397a" exitCode=0 Mar 08 03:38:29.780321 master-0 kubenswrapper[13046]: I0308 03:38:29.780237 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" event={"ID":"6f99839b-bfe0-4190-824a-67227752928b","Type":"ContainerDied","Data":"0ede9ed7177ba70869be5a31d4dca1a5a158e9fa655580e3c9d4316d4926397a"} Mar 08 03:38:29.780321 master-0 kubenswrapper[13046]: I0308 03:38:29.780299 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" event={"ID":"6f99839b-bfe0-4190-824a-67227752928b","Type":"ContainerStarted","Data":"c218598ee392a54cd00dba89307dbee95138f80335484f85f3d59a682e9def20"} Mar 08 03:38:29.783267 master-0 kubenswrapper[13046]: I0308 03:38:29.783198 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67766cb894-9qmph" event={"ID":"f3134df2-c86a-46bb-89ca-3598293b4695","Type":"ContainerStarted","Data":"170e168489adeb31e1817b93124fb91b272d50f96b75a47b83a8bfefbe03502b"} Mar 08 03:38:29.783267 master-0 kubenswrapper[13046]: I0308 03:38:29.783267 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67766cb894-9qmph" event={"ID":"f3134df2-c86a-46bb-89ca-3598293b4695","Type":"ContainerStarted","Data":"4f9d048d579f237ee3fe87b15b98098d4dd4101762f67f7139a1ec5c7df66f1d"} Mar 08 03:38:29.783414 master-0 kubenswrapper[13046]: I0308 03:38:29.783282 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67766cb894-9qmph" event={"ID":"f3134df2-c86a-46bb-89ca-3598293b4695","Type":"ContainerStarted","Data":"5c1f647c671a31c7cc2d2b24735c4a6e2196d11285baf266068e423673d00c8f"} Mar 08 03:38:29.783683 master-0 kubenswrapper[13046]: I0308 03:38:29.783649 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:29.783778 master-0 kubenswrapper[13046]: I0308 03:38:29.783697 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:29.869980 master-0 kubenswrapper[13046]: I0308 03:38:29.869466 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67766cb894-9qmph" podStartSLOduration=1.869445227 podStartE2EDuration="1.869445227s" podCreationTimestamp="2026-03-08 03:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 03:38:29.844937811 +0000 UTC m=+1511.923705028" watchObservedRunningTime="2026-03-08 03:38:29.869445227 +0000 UTC m=+1511.948212444" Mar 08 03:38:30.820783 master-0 kubenswrapper[13046]: I0308 03:38:30.820642 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" event={"ID":"6f99839b-bfe0-4190-824a-67227752928b","Type":"ContainerStarted","Data":"25be4938b5d33e4b7191c632118eb8b9f765491b4bad26b08cacaf2579ea690a"} Mar 08 03:38:30.822215 master-0 kubenswrapper[13046]: I0308 03:38:30.822180 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:30.822215 master-0 kubenswrapper[13046]: I0308 03:38:30.822217 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:30.842913 master-0 kubenswrapper[13046]: I0308 03:38:30.842867 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:30.842913 master-0 kubenswrapper[13046]: I0308 03:38:30.842915 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:30.887319 master-0 kubenswrapper[13046]: I0308 03:38:30.885783 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:30.906885 master-0 kubenswrapper[13046]: I0308 03:38:30.906818 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:30.948734 master-0 kubenswrapper[13046]: I0308 03:38:30.948650 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" podStartSLOduration=3.948618711 podStartE2EDuration="3.948618711s" podCreationTimestamp="2026-03-08 03:38:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:30.946580243 +0000 UTC m=+1513.025347460" watchObservedRunningTime="2026-03-08 03:38:30.948618711 +0000 UTC m=+1513.027385928" Mar 08 03:38:31.839169 master-0 kubenswrapper[13046]: I0308 03:38:31.839101 13046 generic.go:334] "Generic (PLEG): container finished" podID="888c100e-3bc9-45fa-a5a2-fe687ee09c1c" containerID="a300fa2a572f7a256930ec382dbb5fb8085e21791741be6fe4b0f8b9cabbf9a4" exitCode=0 Mar 08 03:38:31.840683 master-0 kubenswrapper[13046]: I0308 03:38:31.840638 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-db-sync-xz5js" event={"ID":"888c100e-3bc9-45fa-a5a2-fe687ee09c1c","Type":"ContainerDied","Data":"a300fa2a572f7a256930ec382dbb5fb8085e21791741be6fe4b0f8b9cabbf9a4"} Mar 08 03:38:31.840824 master-0 kubenswrapper[13046]: I0308 03:38:31.840806 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:31.842643 master-0 kubenswrapper[13046]: I0308 03:38:31.841846 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:31.957508 master-0 kubenswrapper[13046]: I0308 03:38:31.956016 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d8564b49-5mpt2"] Mar 08 03:38:31.959542 master-0 kubenswrapper[13046]: I0308 03:38:31.958057 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:31.966420 master-0 kubenswrapper[13046]: I0308 03:38:31.966376 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 08 03:38:31.966542 master-0 kubenswrapper[13046]: I0308 03:38:31.966479 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 08 03:38:32.007977 master-0 kubenswrapper[13046]: I0308 03:38:32.007852 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d8564b49-5mpt2"] Mar 08 03:38:32.035105 master-0 kubenswrapper[13046]: I0308 03:38:32.035013 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-config\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.035322 master-0 kubenswrapper[13046]: I0308 03:38:32.035261 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-public-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.035368 master-0 kubenswrapper[13046]: I0308 03:38:32.035330 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-httpd-config\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.035442 master-0 kubenswrapper[13046]: I0308 03:38:32.035408 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h8pqt\" (UniqueName: \"kubernetes.io/projected/262f4148-a42c-44b0-b736-023696c55964-kube-api-access-h8pqt\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.035753 master-0 kubenswrapper[13046]: I0308 03:38:32.035722 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-ovndb-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.035957 master-0 kubenswrapper[13046]: I0308 03:38:32.035923 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-internal-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.035994 master-0 kubenswrapper[13046]: I0308 03:38:32.035972 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-combined-ca-bundle\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.138191 master-0 kubenswrapper[13046]: I0308 03:38:32.138148 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-internal-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.138500 master-0 kubenswrapper[13046]: I0308 03:38:32.138447 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-combined-ca-bundle\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.138658 master-0 kubenswrapper[13046]: I0308 03:38:32.138631 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-config\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.138819 master-0 kubenswrapper[13046]: I0308 03:38:32.138798 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-public-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.138868 master-0 kubenswrapper[13046]: I0308 03:38:32.138843 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-httpd-config\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.138958 master-0 kubenswrapper[13046]: I0308 03:38:32.138937 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8pqt\" (UniqueName: \"kubernetes.io/projected/262f4148-a42c-44b0-b736-023696c55964-kube-api-access-h8pqt\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.139291 master-0 kubenswrapper[13046]: I0308 03:38:32.139270 13046 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-ovndb-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.142265 master-0 kubenswrapper[13046]: I0308 03:38:32.142227 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-internal-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.144374 master-0 kubenswrapper[13046]: I0308 03:38:32.143752 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-config\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.144374 master-0 kubenswrapper[13046]: I0308 03:38:32.144321 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-public-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.145633 master-0 kubenswrapper[13046]: I0308 03:38:32.145611 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-ovndb-tls-certs\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.152602 master-0 kubenswrapper[13046]: I0308 03:38:32.149178 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-combined-ca-bundle\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.156498 master-0 kubenswrapper[13046]: I0308 03:38:32.153028 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/262f4148-a42c-44b0-b736-023696c55964-httpd-config\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.163695 master-0 kubenswrapper[13046]: I0308 03:38:32.163628 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8pqt\" (UniqueName: \"kubernetes.io/projected/262f4148-a42c-44b0-b736-023696c55964-kube-api-access-h8pqt\") pod \"neutron-6d8564b49-5mpt2\" (UID: \"262f4148-a42c-44b0-b736-023696c55964\") " pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.350795 master-0 kubenswrapper[13046]: I0308 03:38:32.350738 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:32.610428 master-0 kubenswrapper[13046]: I0308 03:38:32.598667 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:32.610428 master-0 kubenswrapper[13046]: I0308 03:38:32.598752 13046 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:38:32.610428 master-0 kubenswrapper[13046]: I0308 03:38:32.604947 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:38:32.991532 master-0 kubenswrapper[13046]: I0308 03:38:32.991345 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d8564b49-5mpt2"] Mar 08 03:38:33.306754 master-0 kubenswrapper[13046]: I0308 03:38:33.306710 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:38:33.390363 master-0 kubenswrapper[13046]: I0308 03:38:33.390314 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-combined-ca-bundle\") pod \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " Mar 08 03:38:33.390542 master-0 kubenswrapper[13046]: I0308 03:38:33.390454 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-config-data\") pod \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " Mar 08 03:38:33.390585 master-0 kubenswrapper[13046]: I0308 03:38:33.390565 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-db-sync-config-data\") pod \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " Mar 08 03:38:33.390626 master-0 kubenswrapper[13046]: I0308 03:38:33.390608 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5txz2\" (UniqueName: \"kubernetes.io/projected/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-kube-api-access-5txz2\") pod \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " Mar 08 03:38:33.390829 master-0 kubenswrapper[13046]: I0308 03:38:33.390805 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-etc-machine-id\") pod \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " Mar 08 03:38:33.390893 master-0 kubenswrapper[13046]: I0308 03:38:33.390878 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-scripts\") pod \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\" (UID: \"888c100e-3bc9-45fa-a5a2-fe687ee09c1c\") " Mar 08 03:38:33.394106 master-0 kubenswrapper[13046]: I0308 03:38:33.394068 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-scripts" (OuterVolumeSpecName: "scripts") pod "888c100e-3bc9-45fa-a5a2-fe687ee09c1c" (UID: "888c100e-3bc9-45fa-a5a2-fe687ee09c1c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:33.397891 master-0 kubenswrapper[13046]: I0308 03:38:33.397830 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "888c100e-3bc9-45fa-a5a2-fe687ee09c1c" (UID: "888c100e-3bc9-45fa-a5a2-fe687ee09c1c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:33.398049 master-0 kubenswrapper[13046]: I0308 03:38:33.398009 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-kube-api-access-5txz2" (OuterVolumeSpecName: "kube-api-access-5txz2") pod "888c100e-3bc9-45fa-a5a2-fe687ee09c1c" (UID: "888c100e-3bc9-45fa-a5a2-fe687ee09c1c"). InnerVolumeSpecName "kube-api-access-5txz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:33.400791 master-0 kubenswrapper[13046]: I0308 03:38:33.400751 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "888c100e-3bc9-45fa-a5a2-fe687ee09c1c" (UID: "888c100e-3bc9-45fa-a5a2-fe687ee09c1c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:33.428628 master-0 kubenswrapper[13046]: I0308 03:38:33.425330 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "888c100e-3bc9-45fa-a5a2-fe687ee09c1c" (UID: "888c100e-3bc9-45fa-a5a2-fe687ee09c1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:33.451843 master-0 kubenswrapper[13046]: I0308 03:38:33.450244 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-config-data" (OuterVolumeSpecName: "config-data") pod "888c100e-3bc9-45fa-a5a2-fe687ee09c1c" (UID: "888c100e-3bc9-45fa-a5a2-fe687ee09c1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:33.496622 master-0 kubenswrapper[13046]: I0308 03:38:33.494004 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:33.496622 master-0 kubenswrapper[13046]: I0308 03:38:33.494053 13046 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:33.496622 master-0 kubenswrapper[13046]: I0308 03:38:33.494071 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5txz2\" (UniqueName: \"kubernetes.io/projected/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-kube-api-access-5txz2\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:33.496622 master-0 kubenswrapper[13046]: I0308 03:38:33.494083 13046 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:33.496622 master-0 kubenswrapper[13046]: I0308 03:38:33.494093 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:33.496622 master-0 kubenswrapper[13046]: I0308 03:38:33.494106 
13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c100e-3bc9-45fa-a5a2-fe687ee09c1c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:33.868580 master-0 kubenswrapper[13046]: I0308 03:38:33.868511 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d8564b49-5mpt2" event={"ID":"262f4148-a42c-44b0-b736-023696c55964","Type":"ContainerStarted","Data":"906a0a8478c35fa7dd011932e461c6455db063ff7baf570c1023c713f5c1de9a"} Mar 08 03:38:33.868580 master-0 kubenswrapper[13046]: I0308 03:38:33.868571 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d8564b49-5mpt2" event={"ID":"262f4148-a42c-44b0-b736-023696c55964","Type":"ContainerStarted","Data":"cba4e26e46ab565514e4d7509608165ad9bad54a11746508fce6724431cc4c71"} Mar 08 03:38:33.868580 master-0 kubenswrapper[13046]: I0308 03:38:33.868588 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d8564b49-5mpt2" event={"ID":"262f4148-a42c-44b0-b736-023696c55964","Type":"ContainerStarted","Data":"9586db05551b6b1edb25a7f1782836a20742b6adee0920c363149f774f83c4c2"} Mar 08 03:38:33.868934 master-0 kubenswrapper[13046]: I0308 03:38:33.868659 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:38:33.871205 master-0 kubenswrapper[13046]: I0308 03:38:33.871157 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-db-sync-xz5js" Mar 08 03:38:33.871205 master-0 kubenswrapper[13046]: I0308 03:38:33.871174 13046 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:38:33.871205 master-0 kubenswrapper[13046]: I0308 03:38:33.871192 13046 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:38:33.873668 master-0 kubenswrapper[13046]: I0308 03:38:33.871222 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-db-sync-xz5js" event={"ID":"888c100e-3bc9-45fa-a5a2-fe687ee09c1c","Type":"ContainerDied","Data":"e89648db380f8c90eca179e4d2d28ed45f381d183bb8f4e4c1fa97643f7943e0"} Mar 08 03:38:33.873668 master-0 kubenswrapper[13046]: I0308 03:38:33.871269 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e89648db380f8c90eca179e4d2d28ed45f381d183bb8f4e4c1fa97643f7943e0" Mar 08 03:38:33.919900 master-0 kubenswrapper[13046]: I0308 03:38:33.919827 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d8564b49-5mpt2" podStartSLOduration=2.919798857 podStartE2EDuration="2.919798857s" podCreationTimestamp="2026-03-08 03:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:33.89810312 +0000 UTC m=+1515.976870377" watchObservedRunningTime="2026-03-08 03:38:33.919798857 +0000 UTC m=+1515.998566074" Mar 08 03:38:34.236226 master-0 kubenswrapper[13046]: I0308 03:38:34.224661 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:34.236226 master-0 kubenswrapper[13046]: E0308 03:38:34.225252 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888c100e-3bc9-45fa-a5a2-fe687ee09c1c" containerName="cinder-e64dd-db-sync" Mar 08 03:38:34.236226 master-0 kubenswrapper[13046]: I0308 03:38:34.225266 13046 
state_mem.go:107] "Deleted CPUSet assignment" podUID="888c100e-3bc9-45fa-a5a2-fe687ee09c1c" containerName="cinder-e64dd-db-sync" Mar 08 03:38:34.236226 master-0 kubenswrapper[13046]: I0308 03:38:34.225530 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="888c100e-3bc9-45fa-a5a2-fe687ee09c1c" containerName="cinder-e64dd-db-sync" Mar 08 03:38:34.253533 master-0 kubenswrapper[13046]: I0308 03:38:34.251625 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.256748 master-0 kubenswrapper[13046]: I0308 03:38:34.254950 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-scripts" Mar 08 03:38:34.256748 master-0 kubenswrapper[13046]: I0308 03:38:34.255524 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-config-data" Mar 08 03:38:34.256748 master-0 kubenswrapper[13046]: I0308 03:38:34.255695 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-scheduler-config-data" Mar 08 03:38:34.289600 master-0 kubenswrapper[13046]: I0308 03:38:34.289538 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:34.310986 master-0 kubenswrapper[13046]: I0308 03:38:34.309472 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f86777f9-xlvvs"] Mar 08 03:38:34.310986 master-0 kubenswrapper[13046]: I0308 03:38:34.309727 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" podUID="6f99839b-bfe0-4190-824a-67227752928b" containerName="dnsmasq-dns" containerID="cri-o://25be4938b5d33e4b7191c632118eb8b9f765491b4bad26b08cacaf2579ea690a" gracePeriod=10 Mar 08 03:38:34.377387 master-0 kubenswrapper[13046]: I0308 03:38:34.377326 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8kvkt\" (UniqueName: \"kubernetes.io/projected/d514922b-0649-48be-863c-35148c350f12-kube-api-access-8kvkt\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.377693 master-0 kubenswrapper[13046]: I0308 03:38:34.377669 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-scripts\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.377751 master-0 kubenswrapper[13046]: I0308 03:38:34.377718 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data-custom\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.377791 master-0 kubenswrapper[13046]: I0308 03:38:34.377755 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d514922b-0649-48be-863c-35148c350f12-etc-machine-id\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.377847 master-0 kubenswrapper[13046]: I0308 03:38:34.377830 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-combined-ca-bundle\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.377920 master-0 kubenswrapper[13046]: I0308 03:38:34.377903 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.415745 master-0 kubenswrapper[13046]: I0308 03:38:34.415616 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59fbcc86b7-tfsvc"] Mar 08 03:38:34.419769 master-0 kubenswrapper[13046]: I0308 03:38:34.417868 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.480329 master-0 kubenswrapper[13046]: I0308 03:38:34.480227 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-nb\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.480329 master-0 kubenswrapper[13046]: I0308 03:38:34.480323 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-scripts\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480370 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data-custom\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480399 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m64bq\" (UniqueName: \"kubernetes.io/projected/93dab66c-116f-4bae-8331-aad21f0e3232-kube-api-access-m64bq\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480421 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-config\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480440 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d514922b-0649-48be-863c-35148c350f12-etc-machine-id\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480490 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-combined-ca-bundle\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480522 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-sb\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.480632 
master-0 kubenswrapper[13046]: I0308 03:38:34.480547 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480579 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-svc\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480597 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-swift-storage-0\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.480632 master-0 kubenswrapper[13046]: I0308 03:38:34.480628 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kvkt\" (UniqueName: \"kubernetes.io/projected/d514922b-0649-48be-863c-35148c350f12-kube-api-access-8kvkt\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.481712 master-0 kubenswrapper[13046]: I0308 03:38:34.481623 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d514922b-0649-48be-863c-35148c350f12-etc-machine-id\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" 
Mar 08 03:38:34.494739 master-0 kubenswrapper[13046]: I0308 03:38:34.493100 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data-custom\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.494739 master-0 kubenswrapper[13046]: I0308 03:38:34.493608 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-combined-ca-bundle\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.498167 master-0 kubenswrapper[13046]: I0308 03:38:34.498127 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.501497 master-0 kubenswrapper[13046]: I0308 03:38:34.499985 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-scripts\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.505695 master-0 kubenswrapper[13046]: I0308 03:38:34.505654 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kvkt\" (UniqueName: \"kubernetes.io/projected/d514922b-0649-48be-863c-35148c350f12-kube-api-access-8kvkt\") pod \"cinder-e64dd-scheduler-0\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.508701 master-0 kubenswrapper[13046]: I0308 
03:38:34.508665 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fbcc86b7-tfsvc"] Mar 08 03:38:34.567514 master-0 kubenswrapper[13046]: I0308 03:38:34.566730 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:34.574867 master-0 kubenswrapper[13046]: I0308 03:38:34.569794 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.584811 master-0 kubenswrapper[13046]: I0308 03:38:34.583882 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-nb\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.584811 master-0 kubenswrapper[13046]: I0308 03:38:34.583979 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m64bq\" (UniqueName: \"kubernetes.io/projected/93dab66c-116f-4bae-8331-aad21f0e3232-kube-api-access-m64bq\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.584811 master-0 kubenswrapper[13046]: I0308 03:38:34.584006 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-config\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.584811 master-0 kubenswrapper[13046]: I0308 03:38:34.584095 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-sb\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: 
\"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.584811 master-0 kubenswrapper[13046]: I0308 03:38:34.584147 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-swift-storage-0\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.584811 master-0 kubenswrapper[13046]: I0308 03:38:34.584171 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-svc\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.595377 master-0 kubenswrapper[13046]: I0308 03:38:34.590180 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-backup-config-data" Mar 08 03:38:34.595377 master-0 kubenswrapper[13046]: I0308 03:38:34.591778 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-sb\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.595377 master-0 kubenswrapper[13046]: I0308 03:38:34.592662 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-nb\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.595377 master-0 kubenswrapper[13046]: I0308 03:38:34.593366 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-config\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.595377 master-0 kubenswrapper[13046]: I0308 03:38:34.593988 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-svc\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.602748 master-0 kubenswrapper[13046]: I0308 03:38:34.601362 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-swift-storage-0\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.617264 master-0 kubenswrapper[13046]: I0308 03:38:34.617222 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m64bq\" (UniqueName: \"kubernetes.io/projected/93dab66c-116f-4bae-8331-aad21f0e3232-kube-api-access-m64bq\") pod \"dnsmasq-dns-59fbcc86b7-tfsvc\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.636512 master-0 kubenswrapper[13046]: I0308 03:38:34.633740 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"] Mar 08 03:38:34.648803 master-0 kubenswrapper[13046]: I0308 03:38:34.637988 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:34.648803 master-0 kubenswrapper[13046]: I0308 03:38:34.640232 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.657852 master-0 kubenswrapper[13046]: I0308 03:38:34.657201 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:34.673623 master-0 kubenswrapper[13046]: I0308 03:38:34.673564 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"] Mar 08 03:38:34.675029 master-0 kubenswrapper[13046]: I0308 03:38:34.674530 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-volume-lvm-iscsi-config-data" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.685659 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-brick\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.685709 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-iscsi\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.685738 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-machine-id\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.685781 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-44qjr\" (UniqueName: \"kubernetes.io/projected/77422f95-b335-44a2-a0f1-6bd0dcc99431-kube-api-access-44qjr\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686037 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686058 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-nvme\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686078 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-dev\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686100 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-lib-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686122 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-combined-ca-bundle\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686141 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-sys\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686160 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-lib-modules\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686180 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686219 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data-custom\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686238 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-scripts\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.687029 master-0 kubenswrapper[13046]: I0308 03:38:34.686261 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-run\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.690833 master-0 kubenswrapper[13046]: I0308 03:38:34.690137 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:34.773505 master-0 kubenswrapper[13046]: I0308 03:38:34.773309 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:34.785528 master-0 kubenswrapper[13046]: I0308 03:38:34.776937 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.789130 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-api-config-data" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790624 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790669 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-nvme\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790702 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-dev\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790792 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-brick\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790822 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-lib-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790848 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-combined-ca-bundle\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790874 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-sys\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790902 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-lib-modules\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790929 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790954 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data-custom\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.790982 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791007 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-run\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791031 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-nvme\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791054 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791080 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-machine-id\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791126 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data-custom\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791149 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-scripts\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791174 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-scripts\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791201 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-run\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791234 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-iscsi\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791279 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-lib-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791316 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wq8\" (UniqueName: \"kubernetes.io/projected/8eaebe4d-a790-446a-b2a9-492235a0054a-kube-api-access-c8wq8\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791344 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-brick\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791375 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-iscsi\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 
kubenswrapper[13046]: I0308 03:38:34.791407 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-machine-id\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791451 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-sys\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791510 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qjr\" (UniqueName: \"kubernetes.io/projected/77422f95-b335-44a2-a0f1-6bd0dcc99431-kube-api-access-44qjr\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791553 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-dev\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791595 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-combined-ca-bundle\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " 
pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.791632 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-lib-modules\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.792984 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-nvme\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.793113 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-dev\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.795510 master-0 kubenswrapper[13046]: I0308 03:38:34.794325 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-lib-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.796773 master-0 kubenswrapper[13046]: I0308 03:38:34.795642 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-sys\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.796773 master-0 kubenswrapper[13046]: I0308 
03:38:34.795769 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-lib-modules\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.796773 master-0 kubenswrapper[13046]: I0308 03:38:34.796098 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.797512 master-0 kubenswrapper[13046]: I0308 03:38:34.797336 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-brick\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.797512 master-0 kubenswrapper[13046]: I0308 03:38:34.797382 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-iscsi\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.797512 master-0 kubenswrapper[13046]: I0308 03:38:34.797410 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-machine-id\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.806817 master-0 kubenswrapper[13046]: I0308 03:38:34.799281 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-run\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.813958 master-0 kubenswrapper[13046]: I0308 03:38:34.812592 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-combined-ca-bundle\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.813958 master-0 kubenswrapper[13046]: I0308 03:38:34.812818 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:34.813958 master-0 kubenswrapper[13046]: I0308 03:38:34.813887 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-scripts\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.818514 master-0 kubenswrapper[13046]: I0308 03:38:34.814972 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.825733 master-0 kubenswrapper[13046]: I0308 03:38:34.823822 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qjr\" (UniqueName: \"kubernetes.io/projected/77422f95-b335-44a2-a0f1-6bd0dcc99431-kube-api-access-44qjr\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.851506 master-0 kubenswrapper[13046]: I0308 03:38:34.844530 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data-custom\") pod \"cinder-e64dd-backup-0\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898498 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wq8\" (UniqueName: \"kubernetes.io/projected/8eaebe4d-a790-446a-b2a9-492235a0054a-kube-api-access-c8wq8\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898637 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-sys\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898672 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-dev\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898716 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtd97\" (UniqueName: \"kubernetes.io/projected/10ce7c67-df2d-4130-965f-b03780b12e7e-kube-api-access-gtd97\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898738 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-combined-ca-bundle\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898785 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-lib-modules\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898803 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-scripts\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898836 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ce7c67-df2d-4130-965f-b03780b12e7e-etc-machine-id\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898878 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-brick\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898917 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data-custom\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898940 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898959 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data-custom\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.898985 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-run\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.899003 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-nvme\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 
03:38:34.899018 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.899036 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-machine-id\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.899058 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10ce7c67-df2d-4130-965f-b03780b12e7e-logs\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.899082 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.899120 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-scripts\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.899811 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-nvme\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.900163 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-iscsi\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.900224 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-iscsi\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.900418 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-combined-ca-bundle\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.900548 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-lib-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901008 13046 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-lib-modules\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901113 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-machine-id\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901380 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901576 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-run\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901638 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-dev\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901681 13046 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-brick\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901729 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-lib-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.903575 master-0 kubenswrapper[13046]: I0308 03:38:34.901751 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-sys\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.918583 master-0 kubenswrapper[13046]: I0308 03:38:34.914663 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.918583 master-0 kubenswrapper[13046]: I0308 03:38:34.918052 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-combined-ca-bundle\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.920639 master-0 kubenswrapper[13046]: I0308 03:38:34.920171 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data-custom\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.931503 master-0 kubenswrapper[13046]: I0308 03:38:34.926157 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-scripts\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:34.968406 master-0 kubenswrapper[13046]: I0308 03:38:34.968311 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wq8\" (UniqueName: \"kubernetes.io/projected/8eaebe4d-a790-446a-b2a9-492235a0054a-kube-api-access-c8wq8\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:35.028509 master-0 kubenswrapper[13046]: I0308 03:38:35.026254 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-combined-ca-bundle\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.028509 master-0 kubenswrapper[13046]: I0308 03:38:35.026509 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtd97\" (UniqueName: \"kubernetes.io/projected/10ce7c67-df2d-4130-965f-b03780b12e7e-kube-api-access-gtd97\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.028509 master-0 kubenswrapper[13046]: I0308 03:38:35.026588 13046 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-scripts\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.028509 master-0 kubenswrapper[13046]: I0308 03:38:35.026636 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ce7c67-df2d-4130-965f-b03780b12e7e-etc-machine-id\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.028509 master-0 kubenswrapper[13046]: I0308 03:38:35.026727 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data-custom\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.028509 master-0 kubenswrapper[13046]: I0308 03:38:35.026825 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10ce7c67-df2d-4130-965f-b03780b12e7e-logs\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.028509 master-0 kubenswrapper[13046]: I0308 03:38:35.026860 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.044226 master-0 kubenswrapper[13046]: I0308 03:38:35.037704 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ce7c67-df2d-4130-965f-b03780b12e7e-etc-machine-id\") 
pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.044226 master-0 kubenswrapper[13046]: I0308 03:38:35.042275 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:35.044226 master-0 kubenswrapper[13046]: I0308 03:38:35.043418 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10ce7c67-df2d-4130-965f-b03780b12e7e-logs\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.060223 master-0 kubenswrapper[13046]: I0308 03:38:35.044714 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data-custom\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.060223 master-0 kubenswrapper[13046]: I0308 03:38:35.047852 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:35.060223 master-0 kubenswrapper[13046]: I0308 03:38:35.048240 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:35.060223 master-0 kubenswrapper[13046]: I0308 03:38:35.049986 13046 generic.go:334] "Generic (PLEG): container finished" podID="6f99839b-bfe0-4190-824a-67227752928b" containerID="25be4938b5d33e4b7191c632118eb8b9f765491b4bad26b08cacaf2579ea690a" exitCode=0 Mar 08 03:38:35.060223 master-0 kubenswrapper[13046]: I0308 03:38:35.050070 13046 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 03:38:35.060223 master-0 kubenswrapper[13046]: I0308 03:38:35.050991 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" event={"ID":"6f99839b-bfe0-4190-824a-67227752928b","Type":"ContainerDied","Data":"25be4938b5d33e4b7191c632118eb8b9f765491b4bad26b08cacaf2579ea690a"} Mar 08 03:38:35.060223 master-0 kubenswrapper[13046]: I0308 03:38:35.052446 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-combined-ca-bundle\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.071751 master-0 kubenswrapper[13046]: I0308 03:38:35.071586 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.073523 master-0 kubenswrapper[13046]: I0308 03:38:35.072024 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-scripts\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.113510 master-0 kubenswrapper[13046]: I0308 
03:38:35.092638 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtd97\" (UniqueName: \"kubernetes.io/projected/10ce7c67-df2d-4130-965f-b03780b12e7e-kube-api-access-gtd97\") pod \"cinder-e64dd-api-0\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.125789 master-0 kubenswrapper[13046]: I0308 03:38:35.124918 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:35.125789 master-0 kubenswrapper[13046]: I0308 03:38:35.125438 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:38:35.239976 master-0 kubenswrapper[13046]: I0308 03:38:35.239671 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:35.358625 master-0 kubenswrapper[13046]: I0308 03:38:35.358308 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbmjc\" (UniqueName: \"kubernetes.io/projected/6f99839b-bfe0-4190-824a-67227752928b-kube-api-access-pbmjc\") pod \"6f99839b-bfe0-4190-824a-67227752928b\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " Mar 08 03:38:35.358819 master-0 kubenswrapper[13046]: I0308 03:38:35.358763 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-nb\") pod \"6f99839b-bfe0-4190-824a-67227752928b\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " Mar 08 03:38:35.363120 master-0 kubenswrapper[13046]: I0308 03:38:35.362777 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-config\") pod \"6f99839b-bfe0-4190-824a-67227752928b\" (UID: 
\"6f99839b-bfe0-4190-824a-67227752928b\") " Mar 08 03:38:35.363120 master-0 kubenswrapper[13046]: I0308 03:38:35.362820 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-sb\") pod \"6f99839b-bfe0-4190-824a-67227752928b\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " Mar 08 03:38:35.363120 master-0 kubenswrapper[13046]: I0308 03:38:35.362839 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-svc\") pod \"6f99839b-bfe0-4190-824a-67227752928b\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " Mar 08 03:38:35.364382 master-0 kubenswrapper[13046]: I0308 03:38:35.363519 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-swift-storage-0\") pod \"6f99839b-bfe0-4190-824a-67227752928b\" (UID: \"6f99839b-bfe0-4190-824a-67227752928b\") " Mar 08 03:38:35.382336 master-0 kubenswrapper[13046]: I0308 03:38:35.381589 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f99839b-bfe0-4190-824a-67227752928b-kube-api-access-pbmjc" (OuterVolumeSpecName: "kube-api-access-pbmjc") pod "6f99839b-bfe0-4190-824a-67227752928b" (UID: "6f99839b-bfe0-4190-824a-67227752928b"). InnerVolumeSpecName "kube-api-access-pbmjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:35.394832 master-0 kubenswrapper[13046]: I0308 03:38:35.394382 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:35.463662 master-0 kubenswrapper[13046]: I0308 03:38:35.463553 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6f99839b-bfe0-4190-824a-67227752928b" (UID: "6f99839b-bfe0-4190-824a-67227752928b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:35.473029 master-0 kubenswrapper[13046]: I0308 03:38:35.472948 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbmjc\" (UniqueName: \"kubernetes.io/projected/6f99839b-bfe0-4190-824a-67227752928b-kube-api-access-pbmjc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:35.473029 master-0 kubenswrapper[13046]: I0308 03:38:35.473000 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:35.503093 master-0 kubenswrapper[13046]: I0308 03:38:35.502313 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6f99839b-bfe0-4190-824a-67227752928b" (UID: "6f99839b-bfe0-4190-824a-67227752928b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:35.565715 master-0 kubenswrapper[13046]: I0308 03:38:35.562377 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6f99839b-bfe0-4190-824a-67227752928b" (UID: "6f99839b-bfe0-4190-824a-67227752928b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:35.565715 master-0 kubenswrapper[13046]: I0308 03:38:35.563769 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f99839b-bfe0-4190-824a-67227752928b" (UID: "6f99839b-bfe0-4190-824a-67227752928b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:35.580874 master-0 kubenswrapper[13046]: I0308 03:38:35.579603 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-config" (OuterVolumeSpecName: "config") pod "6f99839b-bfe0-4190-824a-67227752928b" (UID: "6f99839b-bfe0-4190-824a-67227752928b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:35.581966 master-0 kubenswrapper[13046]: I0308 03:38:35.581871 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:35.581966 master-0 kubenswrapper[13046]: I0308 03:38:35.581915 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:35.581966 master-0 kubenswrapper[13046]: I0308 03:38:35.581930 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:35.581966 master-0 kubenswrapper[13046]: I0308 03:38:35.581942 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f99839b-bfe0-4190-824a-67227752928b-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:35.605175 master-0 kubenswrapper[13046]: W0308 03:38:35.605131 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93dab66c_116f_4bae_8331_aad21f0e3232.slice/crio-96fac1dd612f94e3e245fbe9f6ea6c81f8f9a1375756073cf827277782922d7f WatchSource:0}: Error finding container 96fac1dd612f94e3e245fbe9f6ea6c81f8f9a1375756073cf827277782922d7f: Status 404 returned error can't find the container with id 96fac1dd612f94e3e245fbe9f6ea6c81f8f9a1375756073cf827277782922d7f Mar 08 03:38:35.606607 master-0 kubenswrapper[13046]: I0308 03:38:35.606539 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fbcc86b7-tfsvc"] Mar 08 03:38:35.832000 master-0 kubenswrapper[13046]: I0308 03:38:35.831940 13046 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"] Mar 08 03:38:35.854913 master-0 kubenswrapper[13046]: W0308 03:38:35.854658 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eaebe4d_a790_446a_b2a9_492235a0054a.slice/crio-a2c2ac1475f02a523be8b985bec60dc4fab2113e74c93cfd6c3ff0a63851ab05 WatchSource:0}: Error finding container a2c2ac1475f02a523be8b985bec60dc4fab2113e74c93cfd6c3ff0a63851ab05: Status 404 returned error can't find the container with id a2c2ac1475f02a523be8b985bec60dc4fab2113e74c93cfd6c3ff0a63851ab05 Mar 08 03:38:36.056514 master-0 kubenswrapper[13046]: I0308 03:38:36.053694 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:36.127006 master-0 kubenswrapper[13046]: I0308 03:38:36.126733 13046 generic.go:334] "Generic (PLEG): container finished" podID="93dab66c-116f-4bae-8331-aad21f0e3232" containerID="755e79d4c0de53579ed2b528e4569e56f490113a2394552739a16926fb9a4f93" exitCode=0 Mar 08 03:38:36.139751 master-0 kubenswrapper[13046]: I0308 03:38:36.138913 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:38:36.152195 master-0 kubenswrapper[13046]: I0308 03:38:36.152117 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" event={"ID":"93dab66c-116f-4bae-8331-aad21f0e3232","Type":"ContainerDied","Data":"755e79d4c0de53579ed2b528e4569e56f490113a2394552739a16926fb9a4f93"} Mar 08 03:38:36.152195 master-0 kubenswrapper[13046]: I0308 03:38:36.152161 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:36.152195 master-0 kubenswrapper[13046]: I0308 03:38:36.152175 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" event={"ID":"93dab66c-116f-4bae-8331-aad21f0e3232","Type":"ContainerStarted","Data":"96fac1dd612f94e3e245fbe9f6ea6c81f8f9a1375756073cf827277782922d7f"} Mar 08 03:38:36.152195 master-0 kubenswrapper[13046]: I0308 03:38:36.152185 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" event={"ID":"6f99839b-bfe0-4190-824a-67227752928b","Type":"ContainerDied","Data":"c218598ee392a54cd00dba89307dbee95138f80335484f85f3d59a682e9def20"} Mar 08 03:38:36.152344 master-0 kubenswrapper[13046]: I0308 03:38:36.152197 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"8eaebe4d-a790-446a-b2a9-492235a0054a","Type":"ContainerStarted","Data":"a2c2ac1475f02a523be8b985bec60dc4fab2113e74c93cfd6c3ff0a63851ab05"} Mar 08 03:38:36.152344 master-0 kubenswrapper[13046]: I0308 03:38:36.152215 13046 scope.go:117] "RemoveContainer" containerID="25be4938b5d33e4b7191c632118eb8b9f765491b4bad26b08cacaf2579ea690a" Mar 08 03:38:36.175970 master-0 kubenswrapper[13046]: I0308 03:38:36.174999 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" 
event={"ID":"d514922b-0649-48be-863c-35148c350f12","Type":"ContainerStarted","Data":"4c1b0e2798bace1d1202c134309bf95c13c25023315b5d755db7b2d2e51c3c91"} Mar 08 03:38:36.252748 master-0 kubenswrapper[13046]: I0308 03:38:36.252708 13046 scope.go:117] "RemoveContainer" containerID="0ede9ed7177ba70869be5a31d4dca1a5a158e9fa655580e3c9d4316d4926397a" Mar 08 03:38:37.247579 master-0 kubenswrapper[13046]: I0308 03:38:37.245752 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"77422f95-b335-44a2-a0f1-6bd0dcc99431","Type":"ContainerStarted","Data":"fe66ca59c1b4c6f824b6add3ae200332997619f2c2ec63da1cc011c1f6165ec1"} Mar 08 03:38:37.285515 master-0 kubenswrapper[13046]: I0308 03:38:37.283816 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" event={"ID":"93dab66c-116f-4bae-8331-aad21f0e3232","Type":"ContainerStarted","Data":"a0e23c53745b817ec21b0155dce83f4beb43b47f0c72ddc93c1730ccb9295cff"} Mar 08 03:38:37.285515 master-0 kubenswrapper[13046]: I0308 03:38:37.285143 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:37.311524 master-0 kubenswrapper[13046]: I0308 03:38:37.308886 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"10ce7c67-df2d-4130-965f-b03780b12e7e","Type":"ContainerStarted","Data":"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7"} Mar 08 03:38:37.311524 master-0 kubenswrapper[13046]: I0308 03:38:37.308941 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"10ce7c67-df2d-4130-965f-b03780b12e7e","Type":"ContainerStarted","Data":"74d69d80e383a5ce9e6199a64801b214d6db05de3ccb267ec4034a6ca9420de7"} Mar 08 03:38:37.328438 master-0 kubenswrapper[13046]: I0308 03:38:37.327391 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" podStartSLOduration=3.327371422 podStartE2EDuration="3.327371422s" podCreationTimestamp="2026-03-08 03:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:37.325944811 +0000 UTC m=+1519.404712028" watchObservedRunningTime="2026-03-08 03:38:37.327371422 +0000 UTC m=+1519.406138629" Mar 08 03:38:37.331748 master-0 kubenswrapper[13046]: I0308 03:38:37.331662 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"8eaebe4d-a790-446a-b2a9-492235a0054a","Type":"ContainerStarted","Data":"cf728eb93965e092742d443f14a39dc03f0ca012c7cba1a95f37c1f8d2a4467b"} Mar 08 03:38:38.339094 master-0 kubenswrapper[13046]: I0308 03:38:38.338742 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:38.379012 master-0 kubenswrapper[13046]: I0308 03:38:38.376589 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"77422f95-b335-44a2-a0f1-6bd0dcc99431","Type":"ContainerStarted","Data":"6342152f984168d919ef5fa1286278cb9413ed83eda8fd2b7998b3d35bb84d28"} Mar 08 03:38:38.379012 master-0 kubenswrapper[13046]: I0308 03:38:38.376649 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"77422f95-b335-44a2-a0f1-6bd0dcc99431","Type":"ContainerStarted","Data":"f9e62e88529d946780a5432526af1e7d5b0baa4fe1e430920afdf01329579f0f"} Mar 08 03:38:38.383973 master-0 kubenswrapper[13046]: I0308 03:38:38.383919 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" event={"ID":"d514922b-0649-48be-863c-35148c350f12","Type":"ContainerStarted","Data":"baf826ad2282fd1c3b073c46066bd9da956986084af2c0666802ac2ceb1ef30c"} Mar 08 03:38:38.387466 master-0 kubenswrapper[13046]: I0308 03:38:38.387426 13046 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"10ce7c67-df2d-4130-965f-b03780b12e7e","Type":"ContainerStarted","Data":"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793"} Mar 08 03:38:38.388890 master-0 kubenswrapper[13046]: I0308 03:38:38.388721 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:38.411440 master-0 kubenswrapper[13046]: I0308 03:38:38.397780 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"8eaebe4d-a790-446a-b2a9-492235a0054a","Type":"ContainerStarted","Data":"bbf279cf01b589ce4e2e57f2d9b5d2e6fbff7e3d5373ec0cc9118fef94380e56"} Mar 08 03:38:38.459384 master-0 kubenswrapper[13046]: I0308 03:38:38.459245 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-api-0" podStartSLOduration=4.459227272 podStartE2EDuration="4.459227272s" podCreationTimestamp="2026-03-08 03:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:38.430858256 +0000 UTC m=+1520.509625473" watchObservedRunningTime="2026-03-08 03:38:38.459227272 +0000 UTC m=+1520.537994489" Mar 08 03:38:38.500086 master-0 kubenswrapper[13046]: I0308 03:38:38.491827 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" podStartSLOduration=3.554203466 podStartE2EDuration="4.491805628s" podCreationTimestamp="2026-03-08 03:38:34 +0000 UTC" firstStartedPulling="2026-03-08 03:38:35.856707113 +0000 UTC m=+1517.935474320" lastFinishedPulling="2026-03-08 03:38:36.794309265 +0000 UTC m=+1518.873076482" observedRunningTime="2026-03-08 03:38:38.476941696 +0000 UTC m=+1520.555708923" watchObservedRunningTime="2026-03-08 03:38:38.491805628 +0000 UTC m=+1520.570572845" Mar 08 03:38:39.408492 master-0 
kubenswrapper[13046]: I0308 03:38:39.408401 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-e64dd-api-0" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-e64dd-api-log" containerID="cri-o://beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7" gracePeriod=30 Mar 08 03:38:39.409575 master-0 kubenswrapper[13046]: I0308 03:38:39.409548 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" event={"ID":"d514922b-0649-48be-863c-35148c350f12","Type":"ContainerStarted","Data":"be1cd4c67d10adc9de41c61f4aa0ea032750727e4708861e4545f82b9e5e2f9d"} Mar 08 03:38:39.411200 master-0 kubenswrapper[13046]: I0308 03:38:39.411175 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-e64dd-api-0" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-api" containerID="cri-o://92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793" gracePeriod=30 Mar 08 03:38:39.438347 master-0 kubenswrapper[13046]: I0308 03:38:39.438263 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-scheduler-0" podStartSLOduration=4.476085361 podStartE2EDuration="5.438244331s" podCreationTimestamp="2026-03-08 03:38:34 +0000 UTC" firstStartedPulling="2026-03-08 03:38:35.402704612 +0000 UTC m=+1517.481471819" lastFinishedPulling="2026-03-08 03:38:36.364863582 +0000 UTC m=+1518.443630789" observedRunningTime="2026-03-08 03:38:39.430780959 +0000 UTC m=+1521.509548176" watchObservedRunningTime="2026-03-08 03:38:39.438244331 +0000 UTC m=+1521.517011548" Mar 08 03:38:39.487632 master-0 kubenswrapper[13046]: I0308 03:38:39.487565 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-backup-0" podStartSLOduration=4.340705354 podStartE2EDuration="5.487551132s" podCreationTimestamp="2026-03-08 03:38:34 +0000 UTC" 
firstStartedPulling="2026-03-08 03:38:36.28489993 +0000 UTC m=+1518.363667147" lastFinishedPulling="2026-03-08 03:38:37.431745708 +0000 UTC m=+1519.510512925" observedRunningTime="2026-03-08 03:38:39.483709483 +0000 UTC m=+1521.562476700" watchObservedRunningTime="2026-03-08 03:38:39.487551132 +0000 UTC m=+1521.566318359" Mar 08 03:38:39.639084 master-0 kubenswrapper[13046]: I0308 03:38:39.638544 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:40.044575 master-0 kubenswrapper[13046]: I0308 03:38:40.043893 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:40.049516 master-0 kubenswrapper[13046]: I0308 03:38:40.049219 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:40.214959 master-0 kubenswrapper[13046]: I0308 03:38:40.214909 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.285613 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data-custom\") pod \"10ce7c67-df2d-4130-965f-b03780b12e7e\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.285788 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10ce7c67-df2d-4130-965f-b03780b12e7e-logs\") pod \"10ce7c67-df2d-4130-965f-b03780b12e7e\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.285833 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-combined-ca-bundle\") pod \"10ce7c67-df2d-4130-965f-b03780b12e7e\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.285880 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtd97\" (UniqueName: \"kubernetes.io/projected/10ce7c67-df2d-4130-965f-b03780b12e7e-kube-api-access-gtd97\") pod \"10ce7c67-df2d-4130-965f-b03780b12e7e\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.285897 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ce7c67-df2d-4130-965f-b03780b12e7e-etc-machine-id\") pod \"10ce7c67-df2d-4130-965f-b03780b12e7e\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.285947 13046 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data\") pod \"10ce7c67-df2d-4130-965f-b03780b12e7e\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.286009 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-scripts\") pod \"10ce7c67-df2d-4130-965f-b03780b12e7e\" (UID: \"10ce7c67-df2d-4130-965f-b03780b12e7e\") " Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.288108 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10ce7c67-df2d-4130-965f-b03780b12e7e-logs" (OuterVolumeSpecName: "logs") pod "10ce7c67-df2d-4130-965f-b03780b12e7e" (UID: "10ce7c67-df2d-4130-965f-b03780b12e7e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.289302 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10ce7c67-df2d-4130-965f-b03780b12e7e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "10ce7c67-df2d-4130-965f-b03780b12e7e" (UID: "10ce7c67-df2d-4130-965f-b03780b12e7e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.292070 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-scripts" (OuterVolumeSpecName: "scripts") pod "10ce7c67-df2d-4130-965f-b03780b12e7e" (UID: "10ce7c67-df2d-4130-965f-b03780b12e7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.293298 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "10ce7c67-df2d-4130-965f-b03780b12e7e" (UID: "10ce7c67-df2d-4130-965f-b03780b12e7e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:40.296604 master-0 kubenswrapper[13046]: I0308 03:38:40.294611 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ce7c67-df2d-4130-965f-b03780b12e7e-kube-api-access-gtd97" (OuterVolumeSpecName: "kube-api-access-gtd97") pod "10ce7c67-df2d-4130-965f-b03780b12e7e" (UID: "10ce7c67-df2d-4130-965f-b03780b12e7e"). InnerVolumeSpecName "kube-api-access-gtd97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:40.333604 master-0 kubenswrapper[13046]: I0308 03:38:40.325748 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10ce7c67-df2d-4130-965f-b03780b12e7e" (UID: "10ce7c67-df2d-4130-965f-b03780b12e7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:40.337014 master-0 kubenswrapper[13046]: I0308 03:38:40.336959 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data" (OuterVolumeSpecName: "config-data") pod "10ce7c67-df2d-4130-965f-b03780b12e7e" (UID: "10ce7c67-df2d-4130-965f-b03780b12e7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:40.388785 master-0 kubenswrapper[13046]: I0308 03:38:40.388714 13046 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:40.388785 master-0 kubenswrapper[13046]: I0308 03:38:40.388770 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10ce7c67-df2d-4130-965f-b03780b12e7e-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:40.388785 master-0 kubenswrapper[13046]: I0308 03:38:40.388786 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:40.389040 master-0 kubenswrapper[13046]: I0308 03:38:40.388799 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtd97\" (UniqueName: \"kubernetes.io/projected/10ce7c67-df2d-4130-965f-b03780b12e7e-kube-api-access-gtd97\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:40.389040 master-0 kubenswrapper[13046]: I0308 03:38:40.388814 13046 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/10ce7c67-df2d-4130-965f-b03780b12e7e-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:40.389040 master-0 kubenswrapper[13046]: I0308 03:38:40.388826 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:40.389040 master-0 kubenswrapper[13046]: I0308 03:38:40.388836 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/10ce7c67-df2d-4130-965f-b03780b12e7e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:40.422087 master-0 kubenswrapper[13046]: I0308 03:38:40.421598 13046 generic.go:334] "Generic (PLEG): container finished" podID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerID="92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793" exitCode=0 Mar 08 03:38:40.422087 master-0 kubenswrapper[13046]: I0308 03:38:40.421632 13046 generic.go:334] "Generic (PLEG): container finished" podID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerID="beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7" exitCode=143 Mar 08 03:38:40.422087 master-0 kubenswrapper[13046]: I0308 03:38:40.421663 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"10ce7c67-df2d-4130-965f-b03780b12e7e","Type":"ContainerDied","Data":"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793"} Mar 08 03:38:40.422087 master-0 kubenswrapper[13046]: I0308 03:38:40.421839 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"10ce7c67-df2d-4130-965f-b03780b12e7e","Type":"ContainerDied","Data":"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7"} Mar 08 03:38:40.422087 master-0 kubenswrapper[13046]: I0308 03:38:40.421862 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"10ce7c67-df2d-4130-965f-b03780b12e7e","Type":"ContainerDied","Data":"74d69d80e383a5ce9e6199a64801b214d6db05de3ccb267ec4034a6ca9420de7"} Mar 08 03:38:40.422087 master-0 kubenswrapper[13046]: I0308 03:38:40.421781 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.422087 master-0 kubenswrapper[13046]: I0308 03:38:40.421899 13046 scope.go:117] "RemoveContainer" containerID="92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793" Mar 08 03:38:40.495942 master-0 kubenswrapper[13046]: I0308 03:38:40.495908 13046 scope.go:117] "RemoveContainer" containerID="beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7" Mar 08 03:38:40.506701 master-0 kubenswrapper[13046]: I0308 03:38:40.506649 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:40.535180 master-0 kubenswrapper[13046]: I0308 03:38:40.532707 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:40.546758 master-0 kubenswrapper[13046]: I0308 03:38:40.546193 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: E0308 03:38:40.546980 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-api" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: I0308 03:38:40.546998 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-api" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: E0308 03:38:40.547008 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-e64dd-api-log" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: I0308 03:38:40.547015 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-e64dd-api-log" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: E0308 03:38:40.547027 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f99839b-bfe0-4190-824a-67227752928b" 
containerName="dnsmasq-dns" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: I0308 03:38:40.547033 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f99839b-bfe0-4190-824a-67227752928b" containerName="dnsmasq-dns" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: E0308 03:38:40.547045 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f99839b-bfe0-4190-824a-67227752928b" containerName="init" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: I0308 03:38:40.547051 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f99839b-bfe0-4190-824a-67227752928b" containerName="init" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: I0308 03:38:40.547236 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f99839b-bfe0-4190-824a-67227752928b" containerName="dnsmasq-dns" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: I0308 03:38:40.547272 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-e64dd-api-log" Mar 08 03:38:40.547910 master-0 kubenswrapper[13046]: I0308 03:38:40.547283 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" containerName="cinder-api" Mar 08 03:38:40.549340 master-0 kubenswrapper[13046]: I0308 03:38:40.549211 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.552229 master-0 kubenswrapper[13046]: I0308 03:38:40.551695 13046 scope.go:117] "RemoveContainer" containerID="92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793" Mar 08 03:38:40.553277 master-0 kubenswrapper[13046]: I0308 03:38:40.553219 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 08 03:38:40.554063 master-0 kubenswrapper[13046]: E0308 03:38:40.553995 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793\": container with ID starting with 92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793 not found: ID does not exist" containerID="92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793" Mar 08 03:38:40.554063 master-0 kubenswrapper[13046]: I0308 03:38:40.554049 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793"} err="failed to get container status \"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793\": rpc error: code = NotFound desc = could not find container \"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793\": container with ID starting with 92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793 not found: ID does not exist" Mar 08 03:38:40.554205 master-0 kubenswrapper[13046]: I0308 03:38:40.554077 13046 scope.go:117] "RemoveContainer" containerID="beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7" Mar 08 03:38:40.554450 master-0 kubenswrapper[13046]: E0308 03:38:40.554359 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7\": container 
with ID starting with beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7 not found: ID does not exist" containerID="beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7" Mar 08 03:38:40.554450 master-0 kubenswrapper[13046]: I0308 03:38:40.554385 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7"} err="failed to get container status \"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7\": rpc error: code = NotFound desc = could not find container \"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7\": container with ID starting with beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7 not found: ID does not exist" Mar 08 03:38:40.554450 master-0 kubenswrapper[13046]: I0308 03:38:40.554399 13046 scope.go:117] "RemoveContainer" containerID="92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793" Mar 08 03:38:40.554725 master-0 kubenswrapper[13046]: I0308 03:38:40.554589 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793"} err="failed to get container status \"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793\": rpc error: code = NotFound desc = could not find container \"92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793\": container with ID starting with 92516a849b6c73a74da44347cfc6574d58d58ccd59710c901848a8b8eec71793 not found: ID does not exist" Mar 08 03:38:40.554725 master-0 kubenswrapper[13046]: I0308 03:38:40.554605 13046 scope.go:117] "RemoveContainer" containerID="beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7" Mar 08 03:38:40.555003 master-0 kubenswrapper[13046]: I0308 03:38:40.554929 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-api-config-data" Mar 08 
03:38:40.555003 master-0 kubenswrapper[13046]: I0308 03:38:40.554971 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7"} err="failed to get container status \"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7\": rpc error: code = NotFound desc = could not find container \"beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7\": container with ID starting with beb2e3c1af85c10591d121ecb12e1ba7eb3e007178fca63dd42d0d49f2153ed7 not found: ID does not exist" Mar 08 03:38:40.555334 master-0 kubenswrapper[13046]: I0308 03:38:40.555274 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 08 03:38:40.567804 master-0 kubenswrapper[13046]: I0308 03:38:40.567679 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:40.698782 master-0 kubenswrapper[13046]: I0308 03:38:40.698654 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-config-data\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699157 master-0 kubenswrapper[13046]: I0308 03:38:40.698825 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-config-data-custom\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699157 master-0 kubenswrapper[13046]: I0308 03:38:40.698921 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/19902afd-81da-486c-96b3-27e0b46b2d38-logs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699157 master-0 kubenswrapper[13046]: I0308 03:38:40.698955 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-scripts\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699157 master-0 kubenswrapper[13046]: I0308 03:38:40.698986 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-combined-ca-bundle\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699157 master-0 kubenswrapper[13046]: I0308 03:38:40.699015 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-internal-tls-certs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699157 master-0 kubenswrapper[13046]: I0308 03:38:40.699046 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-public-tls-certs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699609 master-0 kubenswrapper[13046]: I0308 03:38:40.699184 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/19902afd-81da-486c-96b3-27e0b46b2d38-etc-machine-id\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.699609 master-0 kubenswrapper[13046]: I0308 03:38:40.699220 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rqbv\" (UniqueName: \"kubernetes.io/projected/19902afd-81da-486c-96b3-27e0b46b2d38-kube-api-access-7rqbv\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.801368 master-0 kubenswrapper[13046]: I0308 03:38:40.801209 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19902afd-81da-486c-96b3-27e0b46b2d38-logs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.801775 master-0 kubenswrapper[13046]: I0308 03:38:40.801742 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-scripts\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.801980 master-0 kubenswrapper[13046]: I0308 03:38:40.801951 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-combined-ca-bundle\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.802231 master-0 kubenswrapper[13046]: I0308 03:38:40.802203 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-internal-tls-certs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.802398 master-0 kubenswrapper[13046]: I0308 03:38:40.802070 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19902afd-81da-486c-96b3-27e0b46b2d38-logs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.802559 master-0 kubenswrapper[13046]: I0308 03:38:40.802530 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-public-tls-certs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.803106 master-0 kubenswrapper[13046]: I0308 03:38:40.803077 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19902afd-81da-486c-96b3-27e0b46b2d38-etc-machine-id\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.803380 master-0 kubenswrapper[13046]: I0308 03:38:40.803174 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/19902afd-81da-486c-96b3-27e0b46b2d38-etc-machine-id\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.803565 master-0 kubenswrapper[13046]: I0308 03:38:40.803535 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rqbv\" (UniqueName: \"kubernetes.io/projected/19902afd-81da-486c-96b3-27e0b46b2d38-kube-api-access-7rqbv\") pod 
\"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.804207 master-0 kubenswrapper[13046]: I0308 03:38:40.804176 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-config-data\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.804636 master-0 kubenswrapper[13046]: I0308 03:38:40.804585 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-config-data-custom\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.806076 master-0 kubenswrapper[13046]: I0308 03:38:40.805987 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-combined-ca-bundle\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.807664 master-0 kubenswrapper[13046]: I0308 03:38:40.807381 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-public-tls-certs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.807664 master-0 kubenswrapper[13046]: I0308 03:38:40.807452 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-internal-tls-certs\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " 
pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.817220 master-0 kubenswrapper[13046]: I0308 03:38:40.817163 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-scripts\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.818853 master-0 kubenswrapper[13046]: I0308 03:38:40.818811 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-config-data-custom\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.822948 master-0 kubenswrapper[13046]: I0308 03:38:40.822843 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19902afd-81da-486c-96b3-27e0b46b2d38-config-data\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:40.979613 master-0 kubenswrapper[13046]: I0308 03:38:40.977678 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rqbv\" (UniqueName: \"kubernetes.io/projected/19902afd-81da-486c-96b3-27e0b46b2d38-kube-api-access-7rqbv\") pod \"cinder-e64dd-api-0\" (UID: \"19902afd-81da-486c-96b3-27e0b46b2d38\") " pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:41.195696 master-0 kubenswrapper[13046]: I0308 03:38:41.195620 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:41.889802 master-0 kubenswrapper[13046]: I0308 03:38:41.883517 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-api-0"] Mar 08 03:38:42.136566 master-0 kubenswrapper[13046]: I0308 03:38:42.135303 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ce7c67-df2d-4130-965f-b03780b12e7e" path="/var/lib/kubelet/pods/10ce7c67-df2d-4130-965f-b03780b12e7e/volumes" Mar 08 03:38:42.451664 master-0 kubenswrapper[13046]: I0308 03:38:42.451530 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"19902afd-81da-486c-96b3-27e0b46b2d38","Type":"ContainerStarted","Data":"9cea5a19d20da4c4f4783720eadb81fe5f782ac9808085ac533de3418887ce62"} Mar 08 03:38:42.453587 master-0 kubenswrapper[13046]: I0308 03:38:42.453531 13046 generic.go:334] "Generic (PLEG): container finished" podID="4248bb04-5f13-4afc-9263-49f3c929cd50" containerID="a07e2e878060a43c1b69f035fd7659c0fcbd40f297fb45c9fe13e1436bf2d989" exitCode=0 Mar 08 03:38:42.453587 master-0 kubenswrapper[13046]: I0308 03:38:42.453579 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j4bdp" event={"ID":"4248bb04-5f13-4afc-9263-49f3c929cd50","Type":"ContainerDied","Data":"a07e2e878060a43c1b69f035fd7659c0fcbd40f297fb45c9fe13e1436bf2d989"} Mar 08 03:38:43.473857 master-0 kubenswrapper[13046]: I0308 03:38:43.473306 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" event={"ID":"19902afd-81da-486c-96b3-27e0b46b2d38","Type":"ContainerStarted","Data":"1901ceb4051fb7f0a89930c30ac44edb3758f265b948d80414f64951bea66996"} Mar 08 03:38:43.473857 master-0 kubenswrapper[13046]: I0308 03:38:43.473390 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-api-0" 
event={"ID":"19902afd-81da-486c-96b3-27e0b46b2d38","Type":"ContainerStarted","Data":"861d756bce7142567f56ee1c269f7a4316262edb9ed71d424c230ec6614b542f"} Mar 08 03:38:43.474766 master-0 kubenswrapper[13046]: I0308 03:38:43.473973 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:43.518189 master-0 kubenswrapper[13046]: I0308 03:38:43.518075 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-api-0" podStartSLOduration=3.518056398 podStartE2EDuration="3.518056398s" podCreationTimestamp="2026-03-08 03:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:43.500572691 +0000 UTC m=+1525.579339928" watchObservedRunningTime="2026-03-08 03:38:43.518056398 +0000 UTC m=+1525.596823615" Mar 08 03:38:44.070043 master-0 kubenswrapper[13046]: I0308 03:38:44.069991 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:44.139616 master-0 kubenswrapper[13046]: I0308 03:38:44.138378 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-combined-ca-bundle\") pod \"4248bb04-5f13-4afc-9263-49f3c929cd50\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " Mar 08 03:38:44.139616 master-0 kubenswrapper[13046]: I0308 03:38:44.138459 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-scripts\") pod \"4248bb04-5f13-4afc-9263-49f3c929cd50\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " Mar 08 03:38:44.139616 master-0 kubenswrapper[13046]: I0308 03:38:44.138529 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4248bb04-5f13-4afc-9263-49f3c929cd50-etc-podinfo\") pod \"4248bb04-5f13-4afc-9263-49f3c929cd50\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " Mar 08 03:38:44.139616 master-0 kubenswrapper[13046]: I0308 03:38:44.138602 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data\") pod \"4248bb04-5f13-4afc-9263-49f3c929cd50\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " Mar 08 03:38:44.139616 master-0 kubenswrapper[13046]: I0308 03:38:44.138646 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data-merged\") pod \"4248bb04-5f13-4afc-9263-49f3c929cd50\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " Mar 08 03:38:44.139616 master-0 kubenswrapper[13046]: I0308 03:38:44.138682 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qtvgg\" (UniqueName: \"kubernetes.io/projected/4248bb04-5f13-4afc-9263-49f3c929cd50-kube-api-access-qtvgg\") pod \"4248bb04-5f13-4afc-9263-49f3c929cd50\" (UID: \"4248bb04-5f13-4afc-9263-49f3c929cd50\") " Mar 08 03:38:44.143158 master-0 kubenswrapper[13046]: I0308 03:38:44.142782 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4248bb04-5f13-4afc-9263-49f3c929cd50-kube-api-access-qtvgg" (OuterVolumeSpecName: "kube-api-access-qtvgg") pod "4248bb04-5f13-4afc-9263-49f3c929cd50" (UID: "4248bb04-5f13-4afc-9263-49f3c929cd50"). InnerVolumeSpecName "kube-api-access-qtvgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:44.143158 master-0 kubenswrapper[13046]: I0308 03:38:44.143082 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4248bb04-5f13-4afc-9263-49f3c929cd50-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "4248bb04-5f13-4afc-9263-49f3c929cd50" (UID: "4248bb04-5f13-4afc-9263-49f3c929cd50"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 03:38:44.143836 master-0 kubenswrapper[13046]: I0308 03:38:44.143575 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "4248bb04-5f13-4afc-9263-49f3c929cd50" (UID: "4248bb04-5f13-4afc-9263-49f3c929cd50"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:38:44.171422 master-0 kubenswrapper[13046]: I0308 03:38:44.169601 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-scripts" (OuterVolumeSpecName: "scripts") pod "4248bb04-5f13-4afc-9263-49f3c929cd50" (UID: "4248bb04-5f13-4afc-9263-49f3c929cd50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:44.179632 master-0 kubenswrapper[13046]: I0308 03:38:44.179583 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data" (OuterVolumeSpecName: "config-data") pod "4248bb04-5f13-4afc-9263-49f3c929cd50" (UID: "4248bb04-5f13-4afc-9263-49f3c929cd50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:44.222792 master-0 kubenswrapper[13046]: I0308 03:38:44.222734 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4248bb04-5f13-4afc-9263-49f3c929cd50" (UID: "4248bb04-5f13-4afc-9263-49f3c929cd50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:44.242353 master-0 kubenswrapper[13046]: I0308 03:38:44.242298 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:44.242353 master-0 kubenswrapper[13046]: I0308 03:38:44.242347 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:44.242353 master-0 kubenswrapper[13046]: I0308 03:38:44.242357 13046 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4248bb04-5f13-4afc-9263-49f3c929cd50-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:44.242793 master-0 kubenswrapper[13046]: I0308 03:38:44.242366 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:44.242793 master-0 kubenswrapper[13046]: I0308 03:38:44.242375 13046 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4248bb04-5f13-4afc-9263-49f3c929cd50-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:44.242793 master-0 kubenswrapper[13046]: I0308 03:38:44.242386 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtvgg\" (UniqueName: \"kubernetes.io/projected/4248bb04-5f13-4afc-9263-49f3c929cd50-kube-api-access-qtvgg\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:44.489879 master-0 kubenswrapper[13046]: I0308 03:38:44.489825 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-j4bdp" Mar 08 03:38:44.491380 master-0 kubenswrapper[13046]: I0308 03:38:44.491337 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j4bdp" event={"ID":"4248bb04-5f13-4afc-9263-49f3c929cd50","Type":"ContainerDied","Data":"580dc84a16c29d1e113f4318652d769ef2204ee3dec0ed3db4c444a2de7db2bb"} Mar 08 03:38:44.491380 master-0 kubenswrapper[13046]: I0308 03:38:44.491370 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="580dc84a16c29d1e113f4318652d769ef2204ee3dec0ed3db4c444a2de7db2bb" Mar 08 03:38:44.691814 master-0 kubenswrapper[13046]: I0308 03:38:44.691738 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:38:44.834430 master-0 kubenswrapper[13046]: I0308 03:38:44.825502 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68fb5c97f-jv5vb"] Mar 08 03:38:44.834430 master-0 kubenswrapper[13046]: I0308 03:38:44.825831 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerName="dnsmasq-dns" containerID="cri-o://3e6d6d7f08325e7296da53a6fb4b5387b33fb898f4e52f930fd04a5dc0514c5c" gracePeriod=10 Mar 08 03:38:44.963875 master-0 kubenswrapper[13046]: I0308 03:38:44.962832 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:45.219899 master-0 kubenswrapper[13046]: I0308 03:38:45.207185 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:45.304101 master-0 kubenswrapper[13046]: I0308 03:38:45.301573 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-npd9v"] Mar 08 03:38:45.304101 master-0 kubenswrapper[13046]: E0308 03:38:45.302031 13046 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4248bb04-5f13-4afc-9263-49f3c929cd50" containerName="init" Mar 08 03:38:45.304101 master-0 kubenswrapper[13046]: I0308 03:38:45.302045 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4248bb04-5f13-4afc-9263-49f3c929cd50" containerName="init" Mar 08 03:38:45.304101 master-0 kubenswrapper[13046]: E0308 03:38:45.302063 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4248bb04-5f13-4afc-9263-49f3c929cd50" containerName="ironic-db-sync" Mar 08 03:38:45.304101 master-0 kubenswrapper[13046]: I0308 03:38:45.302070 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4248bb04-5f13-4afc-9263-49f3c929cd50" containerName="ironic-db-sync" Mar 08 03:38:45.316829 master-0 kubenswrapper[13046]: I0308 03:38:45.316419 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="4248bb04-5f13-4afc-9263-49f3c929cd50" containerName="ironic-db-sync" Mar 08 03:38:45.322109 master-0 kubenswrapper[13046]: I0308 03:38:45.319038 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-9f967cb96-7vpvw"] Mar 08 03:38:45.322109 master-0 kubenswrapper[13046]: I0308 03:38:45.320207 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.322109 master-0 kubenswrapper[13046]: I0308 03:38:45.320455 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.322845 master-0 kubenswrapper[13046]: I0308 03:38:45.322776 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Mar 08 03:38:45.418912 master-0 kubenswrapper[13046]: I0308 03:38:45.417557 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-npd9v"] Mar 08 03:38:45.424026 master-0 kubenswrapper[13046]: I0308 03:38:45.419565 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-operator-scripts\") pod \"ironic-inspector-db-create-npd9v\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") " pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.424026 master-0 kubenswrapper[13046]: I0308 03:38:45.419682 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkj69\" (UniqueName: \"kubernetes.io/projected/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-kube-api-access-gkj69\") pod \"ironic-inspector-db-create-npd9v\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") " pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.424026 master-0 kubenswrapper[13046]: I0308 03:38:45.419731 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpw4w\" (UniqueName: \"kubernetes.io/projected/d093b57a-247f-4d76-8ad2-659f459f5f1a-kube-api-access-bpw4w\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.424026 master-0 kubenswrapper[13046]: I0308 03:38:45.419756 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d093b57a-247f-4d76-8ad2-659f459f5f1a-combined-ca-bundle\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.424026 master-0 kubenswrapper[13046]: I0308 03:38:45.419775 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d093b57a-247f-4d76-8ad2-659f459f5f1a-config\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.456549 master-0 kubenswrapper[13046]: I0308 03:38:45.453105 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.209:5353: connect: connection refused" Mar 08 03:38:45.524162 master-0 kubenswrapper[13046]: I0308 03:38:45.522224 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpw4w\" (UniqueName: \"kubernetes.io/projected/d093b57a-247f-4d76-8ad2-659f459f5f1a-kube-api-access-bpw4w\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.524162 master-0 kubenswrapper[13046]: I0308 03:38:45.522290 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d093b57a-247f-4d76-8ad2-659f459f5f1a-combined-ca-bundle\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.524162 master-0 kubenswrapper[13046]: I0308 03:38:45.522309 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/d093b57a-247f-4d76-8ad2-659f459f5f1a-config\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.524162 master-0 kubenswrapper[13046]: I0308 03:38:45.522436 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-operator-scripts\") pod \"ironic-inspector-db-create-npd9v\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") " pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.524162 master-0 kubenswrapper[13046]: I0308 03:38:45.522559 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkj69\" (UniqueName: \"kubernetes.io/projected/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-kube-api-access-gkj69\") pod \"ironic-inspector-db-create-npd9v\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") " pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.528177 master-0 kubenswrapper[13046]: I0308 03:38:45.525376 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-operator-scripts\") pod \"ironic-inspector-db-create-npd9v\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") " pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.530165 master-0 kubenswrapper[13046]: I0308 03:38:45.529740 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-9f967cb96-7vpvw"] Mar 08 03:38:45.536500 master-0 kubenswrapper[13046]: I0308 03:38:45.535862 13046 generic.go:334] "Generic (PLEG): container finished" podID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerID="3e6d6d7f08325e7296da53a6fb4b5387b33fb898f4e52f930fd04a5dc0514c5c" exitCode=0 Mar 08 03:38:45.536500 master-0 
kubenswrapper[13046]: I0308 03:38:45.536046 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-e64dd-scheduler-0" podUID="d514922b-0649-48be-863c-35148c350f12" containerName="cinder-scheduler" containerID="cri-o://baf826ad2282fd1c3b073c46066bd9da956986084af2c0666802ac2ceb1ef30c" gracePeriod=30 Mar 08 03:38:45.536500 master-0 kubenswrapper[13046]: I0308 03:38:45.536262 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" event={"ID":"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37","Type":"ContainerDied","Data":"3e6d6d7f08325e7296da53a6fb4b5387b33fb898f4e52f930fd04a5dc0514c5c"} Mar 08 03:38:45.536686 master-0 kubenswrapper[13046]: I0308 03:38:45.536657 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-e64dd-scheduler-0" podUID="d514922b-0649-48be-863c-35148c350f12" containerName="probe" containerID="cri-o://be1cd4c67d10adc9de41c61f4aa0ea032750727e4708861e4545f82b9e5e2f9d" gracePeriod=30 Mar 08 03:38:45.542225 master-0 kubenswrapper[13046]: I0308 03:38:45.541349 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d093b57a-247f-4d76-8ad2-659f459f5f1a-combined-ca-bundle\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.548059 master-0 kubenswrapper[13046]: I0308 03:38:45.546085 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d093b57a-247f-4d76-8ad2-659f459f5f1a-config\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.549523 master-0 kubenswrapper[13046]: I0308 03:38:45.548867 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gkj69\" (UniqueName: \"kubernetes.io/projected/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-kube-api-access-gkj69\") pod \"ironic-inspector-db-create-npd9v\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") " pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.598504 master-0 kubenswrapper[13046]: I0308 03:38:45.582666 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-a807-account-create-update-j48bd"] Mar 08 03:38:45.598504 master-0 kubenswrapper[13046]: I0308 03:38:45.594415 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.609654 master-0 kubenswrapper[13046]: I0308 03:38:45.605876 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Mar 08 03:38:45.609654 master-0 kubenswrapper[13046]: I0308 03:38:45.606704 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpw4w\" (UniqueName: \"kubernetes.io/projected/d093b57a-247f-4d76-8ad2-659f459f5f1a-kube-api-access-bpw4w\") pod \"ironic-neutron-agent-9f967cb96-7vpvw\" (UID: \"d093b57a-247f-4d76-8ad2-659f459f5f1a\") " pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.731463 master-0 kubenswrapper[13046]: I0308 03:38:45.731418 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ttj\" (UniqueName: \"kubernetes.io/projected/12a2178d-e786-49e5-995e-ac4b269e0089-kube-api-access-b9ttj\") pod \"ironic-inspector-a807-account-create-update-j48bd\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") " pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.731614 master-0 kubenswrapper[13046]: I0308 03:38:45.731549 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/12a2178d-e786-49e5-995e-ac4b269e0089-operator-scripts\") pod \"ironic-inspector-a807-account-create-update-j48bd\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") " pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.778554 master-0 kubenswrapper[13046]: I0308 03:38:45.773548 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-a807-account-create-update-j48bd"] Mar 08 03:38:45.780613 master-0 kubenswrapper[13046]: I0308 03:38:45.778904 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:45.807310 master-0 kubenswrapper[13046]: I0308 03:38:45.806900 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85fc44959f-w8nqq"] Mar 08 03:38:45.809710 master-0 kubenswrapper[13046]: I0308 03:38:45.809659 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.816097 master-0 kubenswrapper[13046]: I0308 03:38:45.816033 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.826466 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.832148 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12a2178d-e786-49e5-995e-ac4b269e0089-operator-scripts\") pod \"ironic-inspector-a807-account-create-update-j48bd\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") " pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.832214 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-nb\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.832324 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-config\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.832373 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-sb\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.832420 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ttj\" (UniqueName: 
\"kubernetes.io/projected/12a2178d-e786-49e5-995e-ac4b269e0089-kube-api-access-b9ttj\") pod \"ironic-inspector-a807-account-create-update-j48bd\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") " pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.832436 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-swift-storage-0\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.833552 master-0 kubenswrapper[13046]: I0308 03:38:45.832458 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-svc\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.834338 master-0 kubenswrapper[13046]: I0308 03:38:45.834009 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12a2178d-e786-49e5-995e-ac4b269e0089-operator-scripts\") pod \"ironic-inspector-a807-account-create-update-j48bd\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") " pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.834419 master-0 kubenswrapper[13046]: I0308 03:38:45.834392 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hz7t\" (UniqueName: \"kubernetes.io/projected/fbcbe677-7983-4b97-9146-604451f6b8d6-kube-api-access-6hz7t\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 
03:38:45.850756 master-0 kubenswrapper[13046]: I0308 03:38:45.849759 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85fc44959f-w8nqq"] Mar 08 03:38:45.873529 master-0 kubenswrapper[13046]: I0308 03:38:45.869940 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ttj\" (UniqueName: \"kubernetes.io/projected/12a2178d-e786-49e5-995e-ac4b269e0089-kube-api-access-b9ttj\") pod \"ironic-inspector-a807-account-create-update-j48bd\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") " pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.936004 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-nb\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.936100 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-config\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.936131 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-sb\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.936175 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-swift-storage-0\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.936198 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-svc\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.936227 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hz7t\" (UniqueName: \"kubernetes.io/projected/fbcbe677-7983-4b97-9146-604451f6b8d6-kube-api-access-6hz7t\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.937359 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-nb\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.938267 master-0 kubenswrapper[13046]: I0308 03:38:45.938009 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-config\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.942529 master-0 kubenswrapper[13046]: I0308 03:38:45.939334 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-sb\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.942529 master-0 kubenswrapper[13046]: I0308 03:38:45.939940 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-swift-storage-0\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.942529 master-0 kubenswrapper[13046]: I0308 03:38:45.940446 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-svc\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.942529 master-0 kubenswrapper[13046]: I0308 03:38:45.941546 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-7b5cd79955-pgwfv"] Mar 08 03:38:45.949903 master-0 kubenswrapper[13046]: I0308 03:38:45.949867 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:45.952838 master-0 kubenswrapper[13046]: I0308 03:38:45.952426 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Mar 08 03:38:45.952838 master-0 kubenswrapper[13046]: I0308 03:38:45.952634 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Mar 08 03:38:45.952957 master-0 kubenswrapper[13046]: I0308 03:38:45.952890 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 08 03:38:45.953326 master-0 kubenswrapper[13046]: I0308 03:38:45.953000 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Mar 08 03:38:45.953326 master-0 kubenswrapper[13046]: I0308 03:38:45.953120 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 03:38:45.955040 master-0 kubenswrapper[13046]: I0308 03:38:45.955010 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:45.964766 master-0 kubenswrapper[13046]: I0308 03:38:45.964219 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hz7t\" (UniqueName: \"kubernetes.io/projected/fbcbe677-7983-4b97-9146-604451f6b8d6-kube-api-access-6hz7t\") pod \"dnsmasq-dns-85fc44959f-w8nqq\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.986506 master-0 kubenswrapper[13046]: I0308 03:38:45.971911 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:45.986506 master-0 kubenswrapper[13046]: I0308 03:38:45.984529 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7b5cd79955-pgwfv"] Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.037647 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-merged\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.037716 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.037772 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5n9f\" (UniqueName: \"kubernetes.io/projected/ce85509f-e75e-477b-8797-4c405e53e3e3-kube-api-access-x5n9f\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.038041 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-scripts\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.038060 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-combined-ca-bundle\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.038092 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce85509f-e75e-477b-8797-4c405e53e3e3-etc-podinfo\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.038117 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-custom\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.042577 master-0 kubenswrapper[13046]: I0308 03:38:46.038144 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-logs\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.059523 master-0 kubenswrapper[13046]: I0308 03:38:46.059052 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:46.100975 master-0 kubenswrapper[13046]: I0308 03:38:46.090615 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:46.159712 master-0 kubenswrapper[13046]: I0308 03:38:46.159677 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-merged\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.160005 master-0 kubenswrapper[13046]: I0308 03:38:46.159989 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.160150 master-0 kubenswrapper[13046]: I0308 03:38:46.160136 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5n9f\" (UniqueName: \"kubernetes.io/projected/ce85509f-e75e-477b-8797-4c405e53e3e3-kube-api-access-x5n9f\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.166025 master-0 kubenswrapper[13046]: I0308 03:38:46.165993 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-scripts\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.166155 master-0 kubenswrapper[13046]: I0308 03:38:46.166139 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-combined-ca-bundle\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.166286 master-0 kubenswrapper[13046]: I0308 03:38:46.166273 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce85509f-e75e-477b-8797-4c405e53e3e3-etc-podinfo\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.166382 master-0 kubenswrapper[13046]: I0308 03:38:46.166369 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-custom\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.166500 master-0 kubenswrapper[13046]: I0308 03:38:46.166473 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-logs\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.176767 master-0 kubenswrapper[13046]: I0308 03:38:46.169252 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-logs\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.176767 master-0 kubenswrapper[13046]: I0308 03:38:46.173787 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:38:46.177886 master-0 kubenswrapper[13046]: I0308 03:38:46.163682 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-merged\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.192271 master-0 kubenswrapper[13046]: I0308 03:38:46.192226 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-custom\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.205554 master-0 kubenswrapper[13046]: I0308 03:38:46.203062 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-scripts\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.206577 master-0 kubenswrapper[13046]: I0308 03:38:46.206547 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.239884 master-0 kubenswrapper[13046]: I0308 03:38:46.233712 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce85509f-e75e-477b-8797-4c405e53e3e3-etc-podinfo\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 
03:38:46.254545 master-0 kubenswrapper[13046]: I0308 03:38:46.245633 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"] Mar 08 03:38:46.262592 master-0 kubenswrapper[13046]: I0308 03:38:46.257823 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-combined-ca-bundle\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.287810 master-0 kubenswrapper[13046]: I0308 03:38:46.275734 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-sb\") pod \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " Mar 08 03:38:46.287810 master-0 kubenswrapper[13046]: I0308 03:38:46.275817 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-swift-storage-0\") pod \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " Mar 08 03:38:46.287810 master-0 kubenswrapper[13046]: I0308 03:38:46.275882 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtdnx\" (UniqueName: \"kubernetes.io/projected/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-kube-api-access-qtdnx\") pod \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " Mar 08 03:38:46.287810 master-0 kubenswrapper[13046]: I0308 03:38:46.276019 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-svc\") pod \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\" 
(UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " Mar 08 03:38:46.287810 master-0 kubenswrapper[13046]: I0308 03:38:46.276118 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-nb\") pod \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " Mar 08 03:38:46.287810 master-0 kubenswrapper[13046]: I0308 03:38:46.276227 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-config\") pod \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\" (UID: \"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37\") " Mar 08 03:38:46.337720 master-0 kubenswrapper[13046]: I0308 03:38:46.323947 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-kube-api-access-qtdnx" (OuterVolumeSpecName: "kube-api-access-qtdnx") pod "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" (UID: "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37"). InnerVolumeSpecName "kube-api-access-qtdnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:46.337720 master-0 kubenswrapper[13046]: I0308 03:38:46.324527 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5n9f\" (UniqueName: \"kubernetes.io/projected/ce85509f-e75e-477b-8797-4c405e53e3e3-kube-api-access-x5n9f\") pod \"ironic-7b5cd79955-pgwfv\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.387930 master-0 kubenswrapper[13046]: I0308 03:38:46.377236 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" (UID: "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:46.387930 master-0 kubenswrapper[13046]: I0308 03:38:46.378444 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:46.387930 master-0 kubenswrapper[13046]: I0308 03:38:46.378467 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtdnx\" (UniqueName: \"kubernetes.io/projected/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-kube-api-access-qtdnx\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:46.416553 master-0 kubenswrapper[13046]: I0308 03:38:46.411545 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:46.416553 master-0 kubenswrapper[13046]: I0308 03:38:46.412291 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" (UID: "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:46.438641 master-0 kubenswrapper[13046]: I0308 03:38:46.438591 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-config" (OuterVolumeSpecName: "config") pod "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" (UID: "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:46.450521 master-0 kubenswrapper[13046]: I0308 03:38:46.450108 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" (UID: "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:46.487620 master-0 kubenswrapper[13046]: I0308 03:38:46.487580 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:46.487620 master-0 kubenswrapper[13046]: I0308 03:38:46.487618 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:46.487734 master-0 kubenswrapper[13046]: I0308 03:38:46.487630 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:46.561522 master-0 kubenswrapper[13046]: I0308 03:38:46.561103 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" (UID: "69d2eb6e-a9b9-4d46-9f86-e7530d25ba37"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:46.607687 master-0 kubenswrapper[13046]: I0308 03:38:46.599544 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:46.652560 master-0 kubenswrapper[13046]: I0308 03:38:46.646983 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-e64dd-backup-0" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="cinder-backup" containerID="cri-o://f9e62e88529d946780a5432526af1e7d5b0baa4fe1e430920afdf01329579f0f" gracePeriod=30 Mar 08 03:38:46.652560 master-0 kubenswrapper[13046]: I0308 03:38:46.647532 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" Mar 08 03:38:46.652560 master-0 kubenswrapper[13046]: I0308 03:38:46.649416 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fb5c97f-jv5vb" event={"ID":"69d2eb6e-a9b9-4d46-9f86-e7530d25ba37","Type":"ContainerDied","Data":"41c14e926ac9ee355bba3f68f69c7c33c04730468b1ab386f389d53703c637a7"} Mar 08 03:38:46.652560 master-0 kubenswrapper[13046]: I0308 03:38:46.649473 13046 scope.go:117] "RemoveContainer" containerID="3e6d6d7f08325e7296da53a6fb4b5387b33fb898f4e52f930fd04a5dc0514c5c" Mar 08 03:38:46.652560 master-0 kubenswrapper[13046]: I0308 03:38:46.649661 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-e64dd-backup-0" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="probe" containerID="cri-o://6342152f984168d919ef5fa1286278cb9413ed83eda8fd2b7998b3d35bb84d28" gracePeriod=30 Mar 08 03:38:46.652560 master-0 kubenswrapper[13046]: I0308 03:38:46.649745 13046 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="probe" containerID="cri-o://bbf279cf01b589ce4e2e57f2d9b5d2e6fbff7e3d5373ec0cc9118fef94380e56" gracePeriod=30 Mar 08 03:38:46.652560 master-0 kubenswrapper[13046]: I0308 03:38:46.649837 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="cinder-volume" containerID="cri-o://cf728eb93965e092742d443f14a39dc03f0ca012c7cba1a95f37c1f8d2a4467b" gracePeriod=30 Mar 08 03:38:46.723979 master-0 kubenswrapper[13046]: I0308 03:38:46.720046 13046 scope.go:117] "RemoveContainer" containerID="ef4fadf57b9bdef465be4bb304a7237f4ebcca696a585847ea78adfd97a00bf7" Mar 08 03:38:46.816584 master-0 kubenswrapper[13046]: I0308 03:38:46.816300 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68fb5c97f-jv5vb"] Mar 08 03:38:46.836532 master-0 kubenswrapper[13046]: I0308 03:38:46.830527 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68fb5c97f-jv5vb"] Mar 08 03:38:47.102506 master-0 kubenswrapper[13046]: I0308 03:38:47.099911 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-9f967cb96-7vpvw"] Mar 08 03:38:47.129962 master-0 kubenswrapper[13046]: W0308 03:38:47.125650 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd093b57a_247f_4d76_8ad2_659f459f5f1a.slice/crio-347355ae2f063bc9d59f107df68de2bcabf98bd3750c603cd845cd52f36e5f2c WatchSource:0}: Error finding container 347355ae2f063bc9d59f107df68de2bcabf98bd3750c603cd845cd52f36e5f2c: Status 404 returned error can't find the container with id 347355ae2f063bc9d59f107df68de2bcabf98bd3750c603cd845cd52f36e5f2c Mar 08 03:38:47.206522 master-0 kubenswrapper[13046]: I0308 03:38:47.203687 13046 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-npd9v"] Mar 08 03:38:47.245342 master-0 kubenswrapper[13046]: W0308 03:38:47.245285 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca72dcd1_1efe_4383_8f9e_3959c2f0a3a3.slice/crio-b874996c5388ae37885bc65182f02e98f1d74b79920eff68c598ff19da8b09b7 WatchSource:0}: Error finding container b874996c5388ae37885bc65182f02e98f1d74b79920eff68c598ff19da8b09b7: Status 404 returned error can't find the container with id b874996c5388ae37885bc65182f02e98f1d74b79920eff68c598ff19da8b09b7 Mar 08 03:38:47.416841 master-0 kubenswrapper[13046]: I0308 03:38:47.416786 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-a807-account-create-update-j48bd"] Mar 08 03:38:47.433260 master-0 kubenswrapper[13046]: W0308 03:38:47.433209 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbcbe677_7983_4b97_9146_604451f6b8d6.slice/crio-dde5bfccb5356bf723103cc57a35fcf6cf372fd6dcaa3c73ba383e370431e0f2 WatchSource:0}: Error finding container dde5bfccb5356bf723103cc57a35fcf6cf372fd6dcaa3c73ba383e370431e0f2: Status 404 returned error can't find the container with id dde5bfccb5356bf723103cc57a35fcf6cf372fd6dcaa3c73ba383e370431e0f2 Mar 08 03:38:47.446073 master-0 kubenswrapper[13046]: I0308 03:38:47.441315 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85fc44959f-w8nqq"] Mar 08 03:38:47.517831 master-0 kubenswrapper[13046]: I0308 03:38:47.517708 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Mar 08 03:38:47.528034 master-0 kubenswrapper[13046]: E0308 03:38:47.527747 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerName="dnsmasq-dns" Mar 08 03:38:47.528034 master-0 kubenswrapper[13046]: I0308 
03:38:47.527795 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerName="dnsmasq-dns" Mar 08 03:38:47.528034 master-0 kubenswrapper[13046]: E0308 03:38:47.527824 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerName="init" Mar 08 03:38:47.528034 master-0 kubenswrapper[13046]: I0308 03:38:47.527832 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerName="init" Mar 08 03:38:47.528259 master-0 kubenswrapper[13046]: I0308 03:38:47.528119 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" containerName="dnsmasq-dns" Mar 08 03:38:47.536503 master-0 kubenswrapper[13046]: I0308 03:38:47.536190 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 08 03:38:47.536503 master-0 kubenswrapper[13046]: I0308 03:38:47.536309 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-conductor-0" Mar 08 03:38:47.576031 master-0 kubenswrapper[13046]: I0308 03:38:47.567381 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Mar 08 03:38:47.576031 master-0 kubenswrapper[13046]: I0308 03:38:47.569053 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Mar 08 03:38:47.663526 master-0 kubenswrapper[13046]: I0308 03:38:47.661739 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-a807-account-create-update-j48bd" event={"ID":"12a2178d-e786-49e5-995e-ac4b269e0089","Type":"ContainerStarted","Data":"61406f7bf69bf5e61ddea83e02e64521e76e4aab5a2f454e6622db0b83270c75"} Mar 08 03:38:47.667820 master-0 kubenswrapper[13046]: I0308 03:38:47.666634 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" event={"ID":"fbcbe677-7983-4b97-9146-604451f6b8d6","Type":"ContainerStarted","Data":"dde5bfccb5356bf723103cc57a35fcf6cf372fd6dcaa3c73ba383e370431e0f2"} Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.671761 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.671850 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.671878 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-850dab94-cae0-43ab-b073-157c4325bb66\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db0d9b43-2bfe-4d50-a38e-005ba4ae4e21\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.671954 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/528b1064-a3b2-4ea4-8584-abeffdbedbbe-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.671973 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-scripts\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.671998 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92fmm\" (UniqueName: \"kubernetes.io/projected/528b1064-a3b2-4ea4-8584-abeffdbedbbe-kube-api-access-92fmm\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.672083 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: 
I0308 03:38:47.672111 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.674103 13046 generic.go:334] "Generic (PLEG): container finished" podID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerID="bbf279cf01b589ce4e2e57f2d9b5d2e6fbff7e3d5373ec0cc9118fef94380e56" exitCode=0 Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.674134 13046 generic.go:334] "Generic (PLEG): container finished" podID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerID="cf728eb93965e092742d443f14a39dc03f0ca012c7cba1a95f37c1f8d2a4467b" exitCode=0 Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.674204 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"8eaebe4d-a790-446a-b2a9-492235a0054a","Type":"ContainerDied","Data":"bbf279cf01b589ce4e2e57f2d9b5d2e6fbff7e3d5373ec0cc9118fef94380e56"} Mar 08 03:38:47.674645 master-0 kubenswrapper[13046]: I0308 03:38:47.674232 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"8eaebe4d-a790-446a-b2a9-492235a0054a","Type":"ContainerDied","Data":"cf728eb93965e092742d443f14a39dc03f0ca012c7cba1a95f37c1f8d2a4467b"} Mar 08 03:38:47.682940 master-0 kubenswrapper[13046]: I0308 03:38:47.682890 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-npd9v" event={"ID":"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3","Type":"ContainerStarted","Data":"b874996c5388ae37885bc65182f02e98f1d74b79920eff68c598ff19da8b09b7"} Mar 08 03:38:47.688104 master-0 kubenswrapper[13046]: I0308 03:38:47.688065 13046 generic.go:334] "Generic 
(PLEG): container finished" podID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerID="6342152f984168d919ef5fa1286278cb9413ed83eda8fd2b7998b3d35bb84d28" exitCode=0 Mar 08 03:38:47.688219 master-0 kubenswrapper[13046]: I0308 03:38:47.688152 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"77422f95-b335-44a2-a0f1-6bd0dcc99431","Type":"ContainerDied","Data":"6342152f984168d919ef5fa1286278cb9413ed83eda8fd2b7998b3d35bb84d28"} Mar 08 03:38:47.705233 master-0 kubenswrapper[13046]: I0308 03:38:47.705180 13046 generic.go:334] "Generic (PLEG): container finished" podID="d514922b-0649-48be-863c-35148c350f12" containerID="be1cd4c67d10adc9de41c61f4aa0ea032750727e4708861e4545f82b9e5e2f9d" exitCode=0 Mar 08 03:38:47.705233 master-0 kubenswrapper[13046]: I0308 03:38:47.705215 13046 generic.go:334] "Generic (PLEG): container finished" podID="d514922b-0649-48be-863c-35148c350f12" containerID="baf826ad2282fd1c3b073c46066bd9da956986084af2c0666802ac2ceb1ef30c" exitCode=0 Mar 08 03:38:47.705383 master-0 kubenswrapper[13046]: I0308 03:38:47.705273 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" event={"ID":"d514922b-0649-48be-863c-35148c350f12","Type":"ContainerDied","Data":"be1cd4c67d10adc9de41c61f4aa0ea032750727e4708861e4545f82b9e5e2f9d"} Mar 08 03:38:47.705383 master-0 kubenswrapper[13046]: I0308 03:38:47.705303 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" event={"ID":"d514922b-0649-48be-863c-35148c350f12","Type":"ContainerDied","Data":"baf826ad2282fd1c3b073c46066bd9da956986084af2c0666802ac2ceb1ef30c"} Mar 08 03:38:47.707214 master-0 kubenswrapper[13046]: I0308 03:38:47.707169 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7b5cd79955-pgwfv"] Mar 08 03:38:47.708786 master-0 kubenswrapper[13046]: I0308 03:38:47.708353 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" event={"ID":"d093b57a-247f-4d76-8ad2-659f459f5f1a","Type":"ContainerStarted","Data":"347355ae2f063bc9d59f107df68de2bcabf98bd3750c603cd845cd52f36e5f2c"} Mar 08 03:38:47.774405 master-0 kubenswrapper[13046]: I0308 03:38:47.774000 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/528b1064-a3b2-4ea4-8584-abeffdbedbbe-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.774405 master-0 kubenswrapper[13046]: I0308 03:38:47.774057 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-scripts\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.774405 master-0 kubenswrapper[13046]: I0308 03:38:47.774288 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92fmm\" (UniqueName: \"kubernetes.io/projected/528b1064-a3b2-4ea4-8584-abeffdbedbbe-kube-api-access-92fmm\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.774804 master-0 kubenswrapper[13046]: I0308 03:38:47.774733 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.774804 master-0 kubenswrapper[13046]: I0308 03:38:47.774799 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.774917 master-0 kubenswrapper[13046]: I0308 03:38:47.774894 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.774990 master-0 kubenswrapper[13046]: I0308 03:38:47.774978 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.775242 master-0 kubenswrapper[13046]: I0308 03:38:47.775207 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-850dab94-cae0-43ab-b073-157c4325bb66\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db0d9b43-2bfe-4d50-a38e-005ba4ae4e21\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.778239 master-0 kubenswrapper[13046]: I0308 03:38:47.778089 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.779624 master-0 kubenswrapper[13046]: I0308 03:38:47.778577 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 03:38:47.779624 master-0 kubenswrapper[13046]: I0308 03:38:47.778606 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-850dab94-cae0-43ab-b073-157c4325bb66\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db0d9b43-2bfe-4d50-a38e-005ba4ae4e21\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c2f99ae46fb7f0779eaae7c9c1a5a774c95b25d71ab83480a3b7489f8964ec54/globalmount\"" pod="openstack/ironic-conductor-0" Mar 08 03:38:47.801245 master-0 kubenswrapper[13046]: I0308 03:38:47.801202 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.801469 master-0 kubenswrapper[13046]: I0308 03:38:47.801429 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.802435 master-0 kubenswrapper[13046]: I0308 03:38:47.802391 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/528b1064-a3b2-4ea4-8584-abeffdbedbbe-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.803288 master-0 kubenswrapper[13046]: I0308 03:38:47.803132 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-scripts\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " 
pod="openstack/ironic-conductor-0" Mar 08 03:38:47.803288 master-0 kubenswrapper[13046]: I0308 03:38:47.803184 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/528b1064-a3b2-4ea4-8584-abeffdbedbbe-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.807137 master-0 kubenswrapper[13046]: I0308 03:38:47.807061 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92fmm\" (UniqueName: \"kubernetes.io/projected/528b1064-a3b2-4ea4-8584-abeffdbedbbe-kube-api-access-92fmm\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0" Mar 08 03:38:47.999170 master-0 kubenswrapper[13046]: I0308 03:38:47.995439 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:48.091762 master-0 kubenswrapper[13046]: I0308 03:38:48.091651 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data\") pod \"d514922b-0649-48be-863c-35148c350f12\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " Mar 08 03:38:48.091878 master-0 kubenswrapper[13046]: I0308 03:38:48.091797 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data-custom\") pod \"d514922b-0649-48be-863c-35148c350f12\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " Mar 08 03:38:48.091916 master-0 kubenswrapper[13046]: I0308 03:38:48.091874 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kvkt\" (UniqueName: 
\"kubernetes.io/projected/d514922b-0649-48be-863c-35148c350f12-kube-api-access-8kvkt\") pod \"d514922b-0649-48be-863c-35148c350f12\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " Mar 08 03:38:48.091916 master-0 kubenswrapper[13046]: I0308 03:38:48.091896 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d514922b-0649-48be-863c-35148c350f12-etc-machine-id\") pod \"d514922b-0649-48be-863c-35148c350f12\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " Mar 08 03:38:48.109579 master-0 kubenswrapper[13046]: I0308 03:38:48.092027 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-combined-ca-bundle\") pod \"d514922b-0649-48be-863c-35148c350f12\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " Mar 08 03:38:48.109579 master-0 kubenswrapper[13046]: I0308 03:38:48.092190 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-scripts\") pod \"d514922b-0649-48be-863c-35148c350f12\" (UID: \"d514922b-0649-48be-863c-35148c350f12\") " Mar 08 03:38:48.109579 master-0 kubenswrapper[13046]: I0308 03:38:48.092982 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d514922b-0649-48be-863c-35148c350f12-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d514922b-0649-48be-863c-35148c350f12" (UID: "d514922b-0649-48be-863c-35148c350f12"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.117295 master-0 kubenswrapper[13046]: I0308 03:38:48.117255 13046 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d514922b-0649-48be-863c-35148c350f12-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.143149 master-0 kubenswrapper[13046]: I0308 03:38:48.143091 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d2eb6e-a9b9-4d46-9f86-e7530d25ba37" path="/var/lib/kubelet/pods/69d2eb6e-a9b9-4d46-9f86-e7530d25ba37/volumes" Mar 08 03:38:48.143742 master-0 kubenswrapper[13046]: I0308 03:38:48.143712 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d514922b-0649-48be-863c-35148c350f12-kube-api-access-8kvkt" (OuterVolumeSpecName: "kube-api-access-8kvkt") pod "d514922b-0649-48be-863c-35148c350f12" (UID: "d514922b-0649-48be-863c-35148c350f12"). InnerVolumeSpecName "kube-api-access-8kvkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:48.147272 master-0 kubenswrapper[13046]: I0308 03:38:48.146866 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d514922b-0649-48be-863c-35148c350f12" (UID: "d514922b-0649-48be-863c-35148c350f12"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:48.149636 master-0 kubenswrapper[13046]: I0308 03:38:48.149593 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-scripts" (OuterVolumeSpecName: "scripts") pod "d514922b-0649-48be-863c-35148c350f12" (UID: "d514922b-0649-48be-863c-35148c350f12"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:48.220718 master-0 kubenswrapper[13046]: I0308 03:38:48.220377 13046 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.220939 master-0 kubenswrapper[13046]: I0308 03:38:48.220923 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kvkt\" (UniqueName: \"kubernetes.io/projected/d514922b-0649-48be-863c-35148c350f12-kube-api-access-8kvkt\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.221003 master-0 kubenswrapper[13046]: I0308 03:38:48.220993 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.443843 master-0 kubenswrapper[13046]: I0308 03:38:48.443792 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data" (OuterVolumeSpecName: "config-data") pod "d514922b-0649-48be-863c-35148c350f12" (UID: "d514922b-0649-48be-863c-35148c350f12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:48.462652 master-0 kubenswrapper[13046]: I0308 03:38:48.462600 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d514922b-0649-48be-863c-35148c350f12" (UID: "d514922b-0649-48be-863c-35148c350f12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:48.528655 master-0 kubenswrapper[13046]: I0308 03:38:48.528612 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.528655 master-0 kubenswrapper[13046]: I0308 03:38:48.528650 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d514922b-0649-48be-863c-35148c350f12-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.682507 master-0 kubenswrapper[13046]: I0308 03:38:48.680015 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:48.767215 master-0 kubenswrapper[13046]: I0308 03:38:48.767040 13046 generic.go:334] "Generic (PLEG): container finished" podID="ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3" containerID="7bb131816473e380629730641e3d66107e61fb8bfdc5c308aa971996501c926a" exitCode=0 Mar 08 03:38:48.767215 master-0 kubenswrapper[13046]: I0308 03:38:48.767131 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-npd9v" event={"ID":"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3","Type":"ContainerDied","Data":"7bb131816473e380629730641e3d66107e61fb8bfdc5c308aa971996501c926a"} Mar 08 03:38:48.769581 master-0 kubenswrapper[13046]: I0308 03:38:48.769543 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerStarted","Data":"c1c20091a85971b691c2a3c64590385fb4d5e3b6312b9c304e791eeac6867806"} Mar 08 03:38:48.781615 master-0 kubenswrapper[13046]: I0308 03:38:48.776823 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" 
event={"ID":"d514922b-0649-48be-863c-35148c350f12","Type":"ContainerDied","Data":"4c1b0e2798bace1d1202c134309bf95c13c25023315b5d755db7b2d2e51c3c91"} Mar 08 03:38:48.781615 master-0 kubenswrapper[13046]: I0308 03:38:48.776893 13046 scope.go:117] "RemoveContainer" containerID="be1cd4c67d10adc9de41c61f4aa0ea032750727e4708861e4545f82b9e5e2f9d" Mar 08 03:38:48.781615 master-0 kubenswrapper[13046]: I0308 03:38:48.777058 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:48.783087 master-0 kubenswrapper[13046]: I0308 03:38:48.783044 13046 generic.go:334] "Generic (PLEG): container finished" podID="12a2178d-e786-49e5-995e-ac4b269e0089" containerID="21c186c0e916378781593218621e52a7e8505b7e0fa558abdd4fb8dc561a04bd" exitCode=0 Mar 08 03:38:48.783264 master-0 kubenswrapper[13046]: I0308 03:38:48.783244 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-a807-account-create-update-j48bd" event={"ID":"12a2178d-e786-49e5-995e-ac4b269e0089","Type":"ContainerDied","Data":"21c186c0e916378781593218621e52a7e8505b7e0fa558abdd4fb8dc561a04bd"} Mar 08 03:38:48.792408 master-0 kubenswrapper[13046]: I0308 03:38:48.791833 13046 generic.go:334] "Generic (PLEG): container finished" podID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerID="4bdb6aac0720cf11b371222ad04927a426b562a9802427436c2e798dd12f9f70" exitCode=0 Mar 08 03:38:48.792408 master-0 kubenswrapper[13046]: I0308 03:38:48.792250 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" event={"ID":"fbcbe677-7983-4b97-9146-604451f6b8d6","Type":"ContainerDied","Data":"4bdb6aac0720cf11b371222ad04927a426b562a9802427436c2e798dd12f9f70"} Mar 08 03:38:48.795138 master-0 kubenswrapper[13046]: I0308 03:38:48.795106 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" 
event={"ID":"8eaebe4d-a790-446a-b2a9-492235a0054a","Type":"ContainerDied","Data":"a2c2ac1475f02a523be8b985bec60dc4fab2113e74c93cfd6c3ff0a63851ab05"} Mar 08 03:38:48.795308 master-0 kubenswrapper[13046]: I0308 03:38:48.795283 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:48.808263 master-0 kubenswrapper[13046]: I0308 03:38:48.808229 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8wq8\" (UniqueName: \"kubernetes.io/projected/8eaebe4d-a790-446a-b2a9-492235a0054a-kube-api-access-c8wq8\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.808349 master-0 kubenswrapper[13046]: I0308 03:38:48.808294 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-dev\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.809200 master-0 kubenswrapper[13046]: I0308 03:38:48.809064 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-dev" (OuterVolumeSpecName: "dev") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.809200 master-0 kubenswrapper[13046]: I0308 03:38:48.809134 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-run\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.809200 master-0 kubenswrapper[13046]: I0308 03:38:48.809201 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809263 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data-custom\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809283 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-sys\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809318 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-scripts\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809362 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-iscsi\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809386 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-lib-cinder\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809402 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-brick\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809427 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-machine-id\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.810285 master-0 kubenswrapper[13046]: I0308 03:38:48.809466 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-lib-modules\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.813514 master-0 kubenswrapper[13046]: I0308 03:38:48.810689 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-nvme\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: 
\"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.813514 master-0 kubenswrapper[13046]: I0308 03:38:48.810774 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-cinder\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.813514 master-0 kubenswrapper[13046]: I0308 03:38:48.810850 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-combined-ca-bundle\") pod \"8eaebe4d-a790-446a-b2a9-492235a0054a\" (UID: \"8eaebe4d-a790-446a-b2a9-492235a0054a\") " Mar 08 03:38:48.817173 master-0 kubenswrapper[13046]: I0308 03:38:48.817136 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.817302 master-0 kubenswrapper[13046]: I0308 03:38:48.817283 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.817508 master-0 kubenswrapper[13046]: I0308 03:38:48.817319 13046 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-dev\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.817745 master-0 kubenswrapper[13046]: I0308 03:38:48.817499 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-run" (OuterVolumeSpecName: "run") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.817863 master-0 kubenswrapper[13046]: I0308 03:38:48.817522 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.817997 master-0 kubenswrapper[13046]: I0308 03:38:48.817552 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.818078 master-0 kubenswrapper[13046]: I0308 03:38:48.817567 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). 
InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.818151 master-0 kubenswrapper[13046]: I0308 03:38:48.817587 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.818208 master-0 kubenswrapper[13046]: I0308 03:38:48.817613 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.818287 master-0 kubenswrapper[13046]: I0308 03:38:48.817922 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-sys" (OuterVolumeSpecName: "sys") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:48.818355 master-0 kubenswrapper[13046]: I0308 03:38:48.818315 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-scripts" (OuterVolumeSpecName: "scripts") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:48.841610 master-0 kubenswrapper[13046]: I0308 03:38:48.838031 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:48.841610 master-0 kubenswrapper[13046]: I0308 03:38:48.839554 13046 scope.go:117] "RemoveContainer" containerID="baf826ad2282fd1c3b073c46066bd9da956986084af2c0666802ac2ceb1ef30c" Mar 08 03:38:48.846082 master-0 kubenswrapper[13046]: I0308 03:38:48.846001 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaebe4d-a790-446a-b2a9-492235a0054a-kube-api-access-c8wq8" (OuterVolumeSpecName: "kube-api-access-c8wq8") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "kube-api-access-c8wq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:48.912601 master-0 kubenswrapper[13046]: I0308 03:38:48.912517 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:48.921348 master-0 kubenswrapper[13046]: I0308 03:38:48.921291 13046 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.921701 master-0 kubenswrapper[13046]: I0308 03:38:48.921681 13046 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.921985 master-0 kubenswrapper[13046]: I0308 03:38:48.921945 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8wq8\" (UniqueName: \"kubernetes.io/projected/8eaebe4d-a790-446a-b2a9-492235a0054a-kube-api-access-c8wq8\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.922086 master-0 kubenswrapper[13046]: I0308 03:38:48.922073 13046 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-run\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.922178 master-0 kubenswrapper[13046]: I0308 03:38:48.922167 13046 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.922257 master-0 kubenswrapper[13046]: I0308 03:38:48.922247 13046 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-sys\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.923920 master-0 kubenswrapper[13046]: I0308 03:38:48.923885 
13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.924040 master-0 kubenswrapper[13046]: I0308 03:38:48.924029 13046 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.924130 master-0 kubenswrapper[13046]: I0308 03:38:48.924119 13046 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.924213 master-0 kubenswrapper[13046]: I0308 03:38:48.924202 13046 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.924630 master-0 kubenswrapper[13046]: I0308 03:38:48.924617 13046 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.924781 master-0 kubenswrapper[13046]: I0308 03:38:48.924768 13046 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8eaebe4d-a790-446a-b2a9-492235a0054a-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:48.937093 master-0 kubenswrapper[13046]: I0308 03:38:48.937020 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:48.953605 master-0 kubenswrapper[13046]: I0308 03:38:48.953477 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:48.966735 master-0 
kubenswrapper[13046]: E0308 03:38:48.954050 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d514922b-0649-48be-863c-35148c350f12" containerName="probe" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.954067 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="d514922b-0649-48be-863c-35148c350f12" containerName="probe" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: E0308 03:38:48.954096 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="cinder-volume" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.954102 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="cinder-volume" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: E0308 03:38:48.954121 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d514922b-0649-48be-863c-35148c350f12" containerName="cinder-scheduler" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.954127 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="d514922b-0649-48be-863c-35148c350f12" containerName="cinder-scheduler" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: E0308 03:38:48.954159 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="probe" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.954165 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="probe" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.959022 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="cinder-volume" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.959097 13046 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d514922b-0649-48be-863c-35148c350f12" containerName="cinder-scheduler" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.959123 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" containerName="probe" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.959146 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="d514922b-0649-48be-863c-35148c350f12" containerName="probe" Mar 08 03:38:48.966735 master-0 kubenswrapper[13046]: I0308 03:38:48.960666 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:48.973616 master-0 kubenswrapper[13046]: I0308 03:38:48.973101 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:48.983619 master-0 kubenswrapper[13046]: I0308 03:38:48.983573 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-scheduler-config-data" Mar 08 03:38:48.983769 master-0 kubenswrapper[13046]: I0308 03:38:48.983746 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:49.030623 master-0 kubenswrapper[13046]: I0308 03:38:49.026430 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-etc-machine-id\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.030623 master-0 kubenswrapper[13046]: I0308 03:38:49.026495 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-scripts\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.030623 master-0 kubenswrapper[13046]: I0308 03:38:49.026669 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-config-data-custom\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.030623 master-0 kubenswrapper[13046]: I0308 03:38:49.026704 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-combined-ca-bundle\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.030623 master-0 kubenswrapper[13046]: I0308 03:38:49.026720 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kstgt\" (UniqueName: 
\"kubernetes.io/projected/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-kube-api-access-kstgt\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.030623 master-0 kubenswrapper[13046]: I0308 03:38:49.026744 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-config-data\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.030623 master-0 kubenswrapper[13046]: I0308 03:38:49.026805 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:49.090858 master-0 kubenswrapper[13046]: I0308 03:38:49.090804 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data" (OuterVolumeSpecName: "config-data") pod "8eaebe4d-a790-446a-b2a9-492235a0054a" (UID: "8eaebe4d-a790-446a-b2a9-492235a0054a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:49.128741 master-0 kubenswrapper[13046]: I0308 03:38:49.128568 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-etc-machine-id\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.128741 master-0 kubenswrapper[13046]: I0308 03:38:49.128640 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-scripts\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.128957 master-0 kubenswrapper[13046]: I0308 03:38:49.128815 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-config-data-custom\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.128957 master-0 kubenswrapper[13046]: I0308 03:38:49.128866 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-combined-ca-bundle\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:49.128957 master-0 kubenswrapper[13046]: I0308 03:38:49.128884 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kstgt\" (UniqueName: \"kubernetes.io/projected/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-kube-api-access-kstgt\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " 
pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.128957 master-0 kubenswrapper[13046]: I0308 03:38:49.128921 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-config-data\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.129064 master-0 kubenswrapper[13046]: I0308 03:38:49.128997 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eaebe4d-a790-446a-b2a9-492235a0054a-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 03:38:49.129223 master-0 kubenswrapper[13046]: I0308 03:38:49.129130 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-etc-machine-id\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.147191 master-0 kubenswrapper[13046]: I0308 03:38:49.147147 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-config-data-custom\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.150215 master-0 kubenswrapper[13046]: I0308 03:38:49.150172 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-scripts\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.150612 master-0 kubenswrapper[13046]: I0308 03:38:49.150535 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-config-data\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.157662 master-0 kubenswrapper[13046]: I0308 03:38:49.154262 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"]
Mar 08 03:38:49.163096 master-0 kubenswrapper[13046]: I0308 03:38:49.162998 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kstgt\" (UniqueName: \"kubernetes.io/projected/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-kube-api-access-kstgt\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.164446 master-0 kubenswrapper[13046]: I0308 03:38:49.164412 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e69d8ee-8a68-41e2-8af4-af4d00ec9af2-combined-ca-bundle\") pod \"cinder-e64dd-scheduler-0\" (UID: \"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2\") " pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.212564 master-0 kubenswrapper[13046]: I0308 03:38:49.212498 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"]
Mar 08 03:38:49.254691 master-0 kubenswrapper[13046]: I0308 03:38:49.254638 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"]
Mar 08 03:38:49.256600 master-0 kubenswrapper[13046]: I0308 03:38:49.256567 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.262142 master-0 kubenswrapper[13046]: I0308 03:38:49.262072 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"]
Mar 08 03:38:49.263955 master-0 kubenswrapper[13046]: I0308 03:38:49.263933 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-volume-lvm-iscsi-config-data"
Mar 08 03:38:49.265959 master-0 kubenswrapper[13046]: I0308 03:38:49.265912 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-850dab94-cae0-43ab-b073-157c4325bb66\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db0d9b43-2bfe-4d50-a38e-005ba4ae4e21\") pod \"ironic-conductor-0\" (UID: \"528b1064-a3b2-4ea4-8584-abeffdbedbbe\") " pod="openstack/ironic-conductor-0"
Mar 08 03:38:49.307649 master-0 kubenswrapper[13046]: I0308 03:38:49.305467 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-scheduler-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346732 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-locks-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346781 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-nvme\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346837 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs4kd\" (UniqueName: \"kubernetes.io/projected/bd0e3137-0784-440c-abba-948535a56e3b-kube-api-access-qs4kd\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346868 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-sys\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346904 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-config-data\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346930 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-iscsi\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346969 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-config-data-custom\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.346988 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-locks-brick\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.347054 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-run\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.347075 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-scripts\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.347131 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-combined-ca-bundle\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.347210 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-lib-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.347244 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-machine-id\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.347281 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-dev\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.352502 master-0 kubenswrapper[13046]: I0308 03:38:49.347305 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-lib-modules\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449099 master-0 kubenswrapper[13046]: I0308 03:38:49.449033 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-combined-ca-bundle\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449283 master-0 kubenswrapper[13046]: I0308 03:38:49.449154 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-lib-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449283 master-0 kubenswrapper[13046]: I0308 03:38:49.449254 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-lib-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449371 master-0 kubenswrapper[13046]: I0308 03:38:49.449293 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-machine-id\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449371 master-0 kubenswrapper[13046]: I0308 03:38:49.449355 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-machine-id\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449432 master-0 kubenswrapper[13046]: I0308 03:38:49.449329 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-dev\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449432 master-0 kubenswrapper[13046]: I0308 03:38:49.449410 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-lib-modules\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449519 master-0 kubenswrapper[13046]: I0308 03:38:49.449445 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-dev\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449519 master-0 kubenswrapper[13046]: I0308 03:38:49.449505 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-lib-modules\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449589 master-0 kubenswrapper[13046]: I0308 03:38:49.449549 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-locks-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449589 master-0 kubenswrapper[13046]: I0308 03:38:49.449567 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-nvme\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449648 master-0 kubenswrapper[13046]: I0308 03:38:49.449620 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-locks-cinder\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449685 master-0 kubenswrapper[13046]: I0308 03:38:49.449658 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs4kd\" (UniqueName: \"kubernetes.io/projected/bd0e3137-0784-440c-abba-948535a56e3b-kube-api-access-qs4kd\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449751 master-0 kubenswrapper[13046]: I0308 03:38:49.449720 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-nvme\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449790 master-0 kubenswrapper[13046]: I0308 03:38:49.449680 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-sys\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449844 master-0 kubenswrapper[13046]: I0308 03:38:49.449826 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-sys\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449892 master-0 kubenswrapper[13046]: I0308 03:38:49.449876 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-config-data\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.449928 master-0 kubenswrapper[13046]: I0308 03:38:49.449901 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-iscsi\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.450190 master-0 kubenswrapper[13046]: I0308 03:38:49.450124 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-etc-iscsi\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.450299 master-0 kubenswrapper[13046]: I0308 03:38:49.450274 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-config-data-custom\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.450339 master-0 kubenswrapper[13046]: I0308 03:38:49.450303 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-locks-brick\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.450502 master-0 kubenswrapper[13046]: I0308 03:38:49.450467 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-run\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.450561 master-0 kubenswrapper[13046]: I0308 03:38:49.450521 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-scripts\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.450561 master-0 kubenswrapper[13046]: I0308 03:38:49.450538 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-run\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.450622 master-0 kubenswrapper[13046]: I0308 03:38:49.450534 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bd0e3137-0784-440c-abba-948535a56e3b-var-locks-brick\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.453809 master-0 kubenswrapper[13046]: I0308 03:38:49.453773 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-combined-ca-bundle\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.453946 master-0 kubenswrapper[13046]: I0308 03:38:49.453917 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-config-data\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.454405 master-0 kubenswrapper[13046]: I0308 03:38:49.454378 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-config-data-custom\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.460099 master-0 kubenswrapper[13046]: I0308 03:38:49.460058 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd0e3137-0784-440c-abba-948535a56e3b-scripts\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.466837 master-0 kubenswrapper[13046]: I0308 03:38:49.466796 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs4kd\" (UniqueName: \"kubernetes.io/projected/bd0e3137-0784-440c-abba-948535a56e3b-kube-api-access-qs4kd\") pod \"cinder-e64dd-volume-lvm-iscsi-0\" (UID: \"bd0e3137-0784-440c-abba-948535a56e3b\") " pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:49.474824 master-0 kubenswrapper[13046]: I0308 03:38:49.474780 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0"
Mar 08 03:38:49.538379 master-0 kubenswrapper[13046]: I0308 03:38:49.538335 13046 scope.go:117] "RemoveContainer" containerID="bbf279cf01b589ce4e2e57f2d9b5d2e6fbff7e3d5373ec0cc9118fef94380e56"
Mar 08 03:38:49.635451 master-0 kubenswrapper[13046]: I0308 03:38:49.635405 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0"
Mar 08 03:38:50.105155 master-0 kubenswrapper[13046]: I0308 03:38:50.105098 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-7859765cb5-ddvmn"]
Mar 08 03:38:50.116649 master-0 kubenswrapper[13046]: I0308 03:38:50.112089 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.118292 master-0 kubenswrapper[13046]: I0308 03:38:50.117365 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc"
Mar 08 03:38:50.118292 master-0 kubenswrapper[13046]: I0308 03:38:50.117556 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc"
Mar 08 03:38:50.152517 master-0 kubenswrapper[13046]: I0308 03:38:50.144911 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eaebe4d-a790-446a-b2a9-492235a0054a" path="/var/lib/kubelet/pods/8eaebe4d-a790-446a-b2a9-492235a0054a/volumes"
Mar 08 03:38:50.152517 master-0 kubenswrapper[13046]: I0308 03:38:50.150877 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d514922b-0649-48be-863c-35148c350f12" path="/var/lib/kubelet/pods/d514922b-0649-48be-863c-35148c350f12/volumes"
Mar 08 03:38:50.152517 master-0 kubenswrapper[13046]: I0308 03:38:50.151708 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7859765cb5-ddvmn"]
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289127 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-etc-podinfo\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289289 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289366 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-public-tls-certs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289395 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-logs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289421 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-internal-tls-certs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289468 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data-custom\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289621 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data-merged\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289653 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccxkr\" (UniqueName: \"kubernetes.io/projected/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-kube-api-access-ccxkr\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289706 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-combined-ca-bundle\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.291402 master-0 kubenswrapper[13046]: I0308 03:38:50.289740 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-scripts\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.391985 master-0 kubenswrapper[13046]: I0308 03:38:50.391908 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data-merged\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.391985 master-0 kubenswrapper[13046]: I0308 03:38:50.391955 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccxkr\" (UniqueName: \"kubernetes.io/projected/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-kube-api-access-ccxkr\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.391985 master-0 kubenswrapper[13046]: I0308 03:38:50.391994 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-combined-ca-bundle\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.391985 master-0 kubenswrapper[13046]: I0308 03:38:50.392019 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-scripts\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.393024 master-0 kubenswrapper[13046]: I0308 03:38:50.392982 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-etc-podinfo\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.393334 master-0 kubenswrapper[13046]: I0308 03:38:50.393088 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.393334 master-0 kubenswrapper[13046]: I0308 03:38:50.393136 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-public-tls-certs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.393334 master-0 kubenswrapper[13046]: I0308 03:38:50.393165 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-logs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.393334 master-0 kubenswrapper[13046]: I0308 03:38:50.393188 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-internal-tls-certs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.393334 master-0 kubenswrapper[13046]: I0308 03:38:50.393223 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data-custom\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.395095 master-0 kubenswrapper[13046]: I0308 03:38:50.395056 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-logs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.395372 master-0 kubenswrapper[13046]: I0308 03:38:50.395346 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data-merged\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.395933 master-0 kubenswrapper[13046]: I0308 03:38:50.395905 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-combined-ca-bundle\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.397290 master-0 kubenswrapper[13046]: I0308 03:38:50.397205 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data-custom\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.398129 master-0 kubenswrapper[13046]: I0308 03:38:50.398045 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-scripts\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.399072 master-0 kubenswrapper[13046]: I0308 03:38:50.398536 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-internal-tls-certs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.399238 master-0 kubenswrapper[13046]: I0308 03:38:50.399180 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-etc-podinfo\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.402324 master-0 kubenswrapper[13046]: I0308 03:38:50.402285 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-config-data\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.406571 master-0 kubenswrapper[13046]: I0308 03:38:50.406454 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-public-tls-certs\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.428909 master-0 kubenswrapper[13046]: I0308 03:38:50.427331 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccxkr\" (UniqueName: \"kubernetes.io/projected/25cc77c3-42af-4d5e-bd93-a0d0c7e07092-kube-api-access-ccxkr\") pod \"ironic-7859765cb5-ddvmn\" (UID: \"25cc77c3-42af-4d5e-bd93-a0d0c7e07092\") " pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.441564 master-0 kubenswrapper[13046]: I0308 03:38:50.441455 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7859765cb5-ddvmn"
Mar 08 03:38:50.498715 master-0 kubenswrapper[13046]: I0308 03:38:50.498677 13046 scope.go:117] "RemoveContainer" containerID="cf728eb93965e092742d443f14a39dc03f0ca012c7cba1a95f37c1f8d2a4467b"
Mar 08 03:38:50.706281 master-0 kubenswrapper[13046]: I0308 03:38:50.705365 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-npd9v"
Mar 08 03:38:50.715255 master-0 kubenswrapper[13046]: I0308 03:38:50.715102 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-a807-account-create-update-j48bd"
Mar 08 03:38:50.812881 master-0 kubenswrapper[13046]: I0308 03:38:50.812415 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12a2178d-e786-49e5-995e-ac4b269e0089-operator-scripts\") pod \"12a2178d-e786-49e5-995e-ac4b269e0089\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") "
Mar 08 03:38:50.819637 master-0 kubenswrapper[13046]: I0308 03:38:50.814227 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12a2178d-e786-49e5-995e-ac4b269e0089-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12a2178d-e786-49e5-995e-ac4b269e0089" (UID: "12a2178d-e786-49e5-995e-ac4b269e0089"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:38:50.819637 master-0 kubenswrapper[13046]: I0308 03:38:50.816800 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkj69\" (UniqueName: \"kubernetes.io/projected/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-kube-api-access-gkj69\") pod \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") "
Mar 08 03:38:50.819637 master-0 kubenswrapper[13046]: I0308 03:38:50.816912 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-operator-scripts\") pod \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\" (UID: \"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3\") "
Mar 08 03:38:50.819637 master-0 kubenswrapper[13046]: I0308 03:38:50.816999 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ttj\" (UniqueName: \"kubernetes.io/projected/12a2178d-e786-49e5-995e-ac4b269e0089-kube-api-access-b9ttj\") pod \"12a2178d-e786-49e5-995e-ac4b269e0089\" (UID: \"12a2178d-e786-49e5-995e-ac4b269e0089\") "
Mar 08 03:38:50.819637 master-0 kubenswrapper[13046]: I0308 03:38:50.817392 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12a2178d-e786-49e5-995e-ac4b269e0089-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:38:50.826838 master-0 kubenswrapper[13046]: I0308 03:38:50.826658 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-kube-api-access-gkj69" (OuterVolumeSpecName: "kube-api-access-gkj69") pod "ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3" (UID: "ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3"). InnerVolumeSpecName "kube-api-access-gkj69".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:50.826999 master-0 kubenswrapper[13046]: I0308 03:38:50.826853 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a2178d-e786-49e5-995e-ac4b269e0089-kube-api-access-b9ttj" (OuterVolumeSpecName: "kube-api-access-b9ttj") pod "12a2178d-e786-49e5-995e-ac4b269e0089" (UID: "12a2178d-e786-49e5-995e-ac4b269e0089"). InnerVolumeSpecName "kube-api-access-b9ttj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:50.834505 master-0 kubenswrapper[13046]: I0308 03:38:50.833821 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3" (UID: "ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:38:50.870643 master-0 kubenswrapper[13046]: I0308 03:38:50.869785 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-npd9v" event={"ID":"ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3","Type":"ContainerDied","Data":"b874996c5388ae37885bc65182f02e98f1d74b79920eff68c598ff19da8b09b7"} Mar 08 03:38:50.870643 master-0 kubenswrapper[13046]: I0308 03:38:50.869834 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b874996c5388ae37885bc65182f02e98f1d74b79920eff68c598ff19da8b09b7" Mar 08 03:38:50.870643 master-0 kubenswrapper[13046]: I0308 03:38:50.869897 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-npd9v" Mar 08 03:38:50.903802 master-0 kubenswrapper[13046]: I0308 03:38:50.898683 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-a807-account-create-update-j48bd" event={"ID":"12a2178d-e786-49e5-995e-ac4b269e0089","Type":"ContainerDied","Data":"61406f7bf69bf5e61ddea83e02e64521e76e4aab5a2f454e6622db0b83270c75"} Mar 08 03:38:50.903802 master-0 kubenswrapper[13046]: I0308 03:38:50.898721 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61406f7bf69bf5e61ddea83e02e64521e76e4aab5a2f454e6622db0b83270c75" Mar 08 03:38:50.903802 master-0 kubenswrapper[13046]: I0308 03:38:50.898781 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-a807-account-create-update-j48bd" Mar 08 03:38:50.936474 master-0 kubenswrapper[13046]: I0308 03:38:50.936366 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkj69\" (UniqueName: \"kubernetes.io/projected/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-kube-api-access-gkj69\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:50.936474 master-0 kubenswrapper[13046]: I0308 03:38:50.936408 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:50.936474 master-0 kubenswrapper[13046]: I0308 03:38:50.936418 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ttj\" (UniqueName: \"kubernetes.io/projected/12a2178d-e786-49e5-995e-ac4b269e0089-kube-api-access-b9ttj\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:51.359509 master-0 kubenswrapper[13046]: W0308 03:38:51.350943 13046 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e69d8ee_8a68_41e2_8af4_af4d00ec9af2.slice/crio-9b0648fa85d88b314af81891c83cd4f91c6952efa4b670041847566a6f8d543c WatchSource:0}: Error finding container 9b0648fa85d88b314af81891c83cd4f91c6952efa4b670041847566a6f8d543c: Status 404 returned error can't find the container with id 9b0648fa85d88b314af81891c83cd4f91c6952efa4b670041847566a6f8d543c Mar 08 03:38:51.359509 master-0 kubenswrapper[13046]: I0308 03:38:51.356554 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-scheduler-0"] Mar 08 03:38:51.371565 master-0 kubenswrapper[13046]: I0308 03:38:51.368172 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 08 03:38:51.379472 master-0 kubenswrapper[13046]: W0308 03:38:51.375701 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod528b1064_a3b2_4ea4_8584_abeffdbedbbe.slice/crio-2a6c15f2b967ff5567128188b8fd0068e601346b9d58ec42da64099cdabaec47 WatchSource:0}: Error finding container 2a6c15f2b967ff5567128188b8fd0068e601346b9d58ec42da64099cdabaec47: Status 404 returned error can't find the container with id 2a6c15f2b967ff5567128188b8fd0068e601346b9d58ec42da64099cdabaec47 Mar 08 03:38:51.505175 master-0 kubenswrapper[13046]: I0308 03:38:51.504086 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-volume-lvm-iscsi-0"] Mar 08 03:38:51.515132 master-0 kubenswrapper[13046]: I0308 03:38:51.515070 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7859765cb5-ddvmn"] Mar 08 03:38:51.957753 master-0 kubenswrapper[13046]: I0308 03:38:51.957696 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" 
event={"ID":"d093b57a-247f-4d76-8ad2-659f459f5f1a","Type":"ContainerStarted","Data":"4c2490c757bd4af990c59c840fc0cceb1cf1204f9198bcc6a46f24a8b703cd6d"} Mar 08 03:38:51.959282 master-0 kubenswrapper[13046]: I0308 03:38:51.959251 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:51.968989 master-0 kubenswrapper[13046]: I0308 03:38:51.968944 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" event={"ID":"fbcbe677-7983-4b97-9146-604451f6b8d6","Type":"ContainerStarted","Data":"3c0972bc923fef035427b96163d52c64dcc7330036f87132230479f40575b42c"} Mar 08 03:38:51.970020 master-0 kubenswrapper[13046]: I0308 03:38:51.969991 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:38:51.972147 master-0 kubenswrapper[13046]: I0308 03:38:51.972122 13046 generic.go:334] "Generic (PLEG): container finished" podID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerID="f9e62e88529d946780a5432526af1e7d5b0baa4fe1e430920afdf01329579f0f" exitCode=0 Mar 08 03:38:51.972214 master-0 kubenswrapper[13046]: I0308 03:38:51.972165 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"77422f95-b335-44a2-a0f1-6bd0dcc99431","Type":"ContainerDied","Data":"f9e62e88529d946780a5432526af1e7d5b0baa4fe1e430920afdf01329579f0f"} Mar 08 03:38:51.976962 master-0 kubenswrapper[13046]: I0308 03:38:51.976919 13046 generic.go:334] "Generic (PLEG): container finished" podID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerID="2b0f1164d47af747ce565f753aebedb8270b64d5394e72ef1a65378ef3976cbf" exitCode=0 Mar 08 03:38:51.978714 master-0 kubenswrapper[13046]: I0308 03:38:51.977213 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" 
event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerDied","Data":"2b0f1164d47af747ce565f753aebedb8270b64d5394e72ef1a65378ef3976cbf"} Mar 08 03:38:51.982919 master-0 kubenswrapper[13046]: I0308 03:38:51.982818 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerStarted","Data":"2a6c15f2b967ff5567128188b8fd0068e601346b9d58ec42da64099cdabaec47"} Mar 08 03:38:52.008323 master-0 kubenswrapper[13046]: I0308 03:38:52.006669 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" podStartSLOduration=3.493942255 podStartE2EDuration="7.006646387s" podCreationTimestamp="2026-03-08 03:38:45 +0000 UTC" firstStartedPulling="2026-03-08 03:38:47.129196587 +0000 UTC m=+1529.207963794" lastFinishedPulling="2026-03-08 03:38:50.641900709 +0000 UTC m=+1532.720667926" observedRunningTime="2026-03-08 03:38:51.998758083 +0000 UTC m=+1534.077525300" watchObservedRunningTime="2026-03-08 03:38:52.006646387 +0000 UTC m=+1534.085413594" Mar 08 03:38:52.028113 master-0 kubenswrapper[13046]: I0308 03:38:52.025167 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"bd0e3137-0784-440c-abba-948535a56e3b","Type":"ContainerStarted","Data":"f3dd576595d21f8f09d9877925495f012cb864a0034688533949474c139d0176"} Mar 08 03:38:52.028272 master-0 kubenswrapper[13046]: I0308 03:38:52.028121 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7859765cb5-ddvmn" event={"ID":"25cc77c3-42af-4d5e-bd93-a0d0c7e07092","Type":"ContainerStarted","Data":"d5a687aea3a4ce667183235d43dab974ffe6c83750a6c83fc02c12f46fd3e23a"} Mar 08 03:38:52.032405 master-0 kubenswrapper[13046]: I0308 03:38:52.032360 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" 
event={"ID":"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2","Type":"ContainerStarted","Data":"9b0648fa85d88b314af81891c83cd4f91c6952efa4b670041847566a6f8d543c"} Mar 08 03:38:52.045400 master-0 kubenswrapper[13046]: I0308 03:38:52.041298 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:52.056604 master-0 kubenswrapper[13046]: I0308 03:38:52.055834 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" podStartSLOduration=7.055815275 podStartE2EDuration="7.055815275s" podCreationTimestamp="2026-03-08 03:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:52.051028648 +0000 UTC m=+1534.129795885" watchObservedRunningTime="2026-03-08 03:38:52.055815275 +0000 UTC m=+1534.134582492" Mar 08 03:38:52.130418 master-0 kubenswrapper[13046]: I0308 03:38:52.130363 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-brick\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.130640 master-0 kubenswrapper[13046]: I0308 03:38:52.130468 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-dev\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.130640 master-0 kubenswrapper[13046]: I0308 03:38:52.130605 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-machine-id\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: 
\"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.130710 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-sys\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131091 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131163 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-nvme\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131211 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-run\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131238 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qjr\" (UniqueName: \"kubernetes.io/projected/77422f95-b335-44a2-a0f1-6bd0dcc99431-kube-api-access-44qjr\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131342 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-combined-ca-bundle\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131397 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-scripts\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131417 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-lib-modules\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131513 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-cinder\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131550 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-lib-cinder\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131680 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data-custom\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: 
\"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131713 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-iscsi\") pod \"77422f95-b335-44a2-a0f1-6bd0dcc99431\" (UID: \"77422f95-b335-44a2-a0f1-6bd0dcc99431\") " Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131790 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131786 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-run" (OuterVolumeSpecName: "run") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131828 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131882 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-dev" (OuterVolumeSpecName: "dev") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131902 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131923 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-sys" (OuterVolumeSpecName: "sys") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131962 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.131997 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.133205 master-0 kubenswrapper[13046]: I0308 03:38:52.133009 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.134759 master-0 kubenswrapper[13046]: I0308 03:38:52.134738 13046 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.134827 master-0 kubenswrapper[13046]: I0308 03:38:52.134762 13046 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-sys\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.134827 master-0 kubenswrapper[13046]: I0308 03:38:52.134775 13046 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-run\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.134827 master-0 kubenswrapper[13046]: I0308 03:38:52.134787 13046 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.134827 master-0 kubenswrapper[13046]: I0308 03:38:52.134801 13046 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.134827 master-0 kubenswrapper[13046]: I0308 03:38:52.134813 13046 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.134827 master-0 kubenswrapper[13046]: I0308 03:38:52.134825 13046 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.134997 master-0 kubenswrapper[13046]: I0308 03:38:52.134835 13046 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-dev\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.135867 master-0 kubenswrapper[13046]: I0308 03:38:52.135845 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 03:38:52.192047 master-0 kubenswrapper[13046]: I0308 03:38:52.189974 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77422f95-b335-44a2-a0f1-6bd0dcc99431-kube-api-access-44qjr" (OuterVolumeSpecName: "kube-api-access-44qjr") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "kube-api-access-44qjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:38:52.192230 master-0 kubenswrapper[13046]: I0308 03:38:52.192115 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:52.227868 master-0 kubenswrapper[13046]: I0308 03:38:52.227819 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-scripts" (OuterVolumeSpecName: "scripts") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:52.256431 master-0 kubenswrapper[13046]: I0308 03:38:52.256392 13046 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.256623 master-0 kubenswrapper[13046]: I0308 03:38:52.256611 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qjr\" (UniqueName: \"kubernetes.io/projected/77422f95-b335-44a2-a0f1-6bd0dcc99431-kube-api-access-44qjr\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.256708 master-0 kubenswrapper[13046]: I0308 03:38:52.256698 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.256770 master-0 kubenswrapper[13046]: I0308 03:38:52.256760 13046 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.256853 master-0 kubenswrapper[13046]: I0308 03:38:52.256843 13046 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/77422f95-b335-44a2-a0f1-6bd0dcc99431-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:52.979503 master-0 kubenswrapper[13046]: I0308 03:38:52.974680 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:52.995504 master-0 kubenswrapper[13046]: I0308 03:38:52.990269 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:53.090901 master-0 kubenswrapper[13046]: I0308 03:38:53.090725 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"77422f95-b335-44a2-a0f1-6bd0dcc99431","Type":"ContainerDied","Data":"fe66ca59c1b4c6f824b6add3ae200332997619f2c2ec63da1cc011c1f6165ec1"} Mar 08 03:38:53.090901 master-0 kubenswrapper[13046]: I0308 03:38:53.090787 13046 scope.go:117] "RemoveContainer" containerID="6342152f984168d919ef5fa1286278cb9413ed83eda8fd2b7998b3d35bb84d28" Mar 08 03:38:53.091031 master-0 kubenswrapper[13046]: I0308 03:38:53.090935 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:53.109910 master-0 kubenswrapper[13046]: I0308 03:38:53.109869 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerStarted","Data":"a5b6711eff6dd2e62e4f814e13dab71a094483239358ebda77bba8ef86ca5e62"} Mar 08 03:38:53.115683 master-0 kubenswrapper[13046]: I0308 03:38:53.115634 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data" (OuterVolumeSpecName: "config-data") pod "77422f95-b335-44a2-a0f1-6bd0dcc99431" (UID: "77422f95-b335-44a2-a0f1-6bd0dcc99431"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:38:53.133727 master-0 kubenswrapper[13046]: I0308 03:38:53.133673 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerStarted","Data":"c98ac0725684694acc42cbb1735351d2f5635f95e00fb9b8a56f40a4cc2c5d6e"} Mar 08 03:38:53.144501 master-0 kubenswrapper[13046]: I0308 03:38:53.140991 13046 scope.go:117] "RemoveContainer" containerID="f9e62e88529d946780a5432526af1e7d5b0baa4fe1e430920afdf01329579f0f" Mar 08 03:38:53.148510 master-0 kubenswrapper[13046]: I0308 03:38:53.146770 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"bd0e3137-0784-440c-abba-948535a56e3b","Type":"ContainerStarted","Data":"ee3554259c964b1887f351a89d716c599950076aeb4813c1983c7e8d6e4f8258"} Mar 08 03:38:53.168215 master-0 kubenswrapper[13046]: I0308 03:38:53.163866 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7859765cb5-ddvmn" event={"ID":"25cc77c3-42af-4d5e-bd93-a0d0c7e07092","Type":"ContainerStarted","Data":"3b8c5299ea1578665935913094985895a2d8c285e6a846a3d16693c0ad238a7b"} Mar 08 03:38:53.198447 master-0 kubenswrapper[13046]: I0308 03:38:53.194035 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" event={"ID":"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2","Type":"ContainerStarted","Data":"80762aefc35cac70f16b0291161d22a68eb3e8ce808072c483076a056d88d74d"} Mar 08 03:38:53.208956 master-0 kubenswrapper[13046]: I0308 03:38:53.202938 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77422f95-b335-44a2-a0f1-6bd0dcc99431-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:38:53.560571 master-0 kubenswrapper[13046]: I0308 03:38:53.559237 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:53.714550 master-0 kubenswrapper[13046]: I0308 03:38:53.714150 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:38:53.799842 master-0 kubenswrapper[13046]: I0308 03:38:53.799769 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:53.827731 master-0 kubenswrapper[13046]: I0308 03:38:53.826546 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.883651 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: E0308 03:38:53.884186 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3" containerName="mariadb-database-create" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884201 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3" containerName="mariadb-database-create" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: E0308 03:38:53.884219 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a2178d-e786-49e5-995e-ac4b269e0089" containerName="mariadb-account-create-update" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884226 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a2178d-e786-49e5-995e-ac4b269e0089" containerName="mariadb-account-create-update" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: E0308 03:38:53.884243 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="probe" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884250 13046 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="probe" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: E0308 03:38:53.884294 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="cinder-backup" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884300 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="cinder-backup" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884515 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="cinder-backup" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884552 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3" containerName="mariadb-database-create" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884574 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a2178d-e786-49e5-995e-ac4b269e0089" containerName="mariadb-account-create-update" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.884587 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" containerName="probe" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.885757 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:53.888506 master-0 kubenswrapper[13046]: I0308 03:38:53.887712 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-e64dd-backup-config-data" Mar 08 03:38:53.955290 master-0 kubenswrapper[13046]: I0308 03:38:53.953543 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087554 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-locks-brick\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087604 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-combined-ca-bundle\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087654 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-run\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087701 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-dev\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " 
pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087739 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-lib-modules\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087786 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-machine-id\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087825 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-locks-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087853 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-config-data\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088114 master-0 kubenswrapper[13046]: I0308 03:38:54.087883 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-nvme\") pod \"cinder-e64dd-backup-0\" (UID: 
\"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088774 master-0 kubenswrapper[13046]: I0308 03:38:54.088410 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-config-data-custom\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088774 master-0 kubenswrapper[13046]: I0308 03:38:54.088527 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-scripts\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088774 master-0 kubenswrapper[13046]: I0308 03:38:54.088585 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-iscsi\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088867 master-0 kubenswrapper[13046]: I0308 03:38:54.088792 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-lib-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.088867 master-0 kubenswrapper[13046]: I0308 03:38:54.088815 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-sys\") pod \"cinder-e64dd-backup-0\" (UID: 
\"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.108938 master-0 kubenswrapper[13046]: I0308 03:38:54.088900 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-726hz\" (UniqueName: \"kubernetes.io/projected/100689ad-dc43-494b-a7a2-f0351b969ab7-kube-api-access-726hz\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.193999 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-726hz\" (UniqueName: \"kubernetes.io/projected/100689ad-dc43-494b-a7a2-f0351b969ab7-kube-api-access-726hz\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194062 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-locks-brick\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194078 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-combined-ca-bundle\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194108 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-run\") pod \"cinder-e64dd-backup-0\" (UID: 
\"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194134 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-dev\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194158 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-lib-modules\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194183 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-machine-id\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194207 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-locks-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194225 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-config-data\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 
08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194246 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-nvme\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194319 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-config-data-custom\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194352 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-scripts\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194377 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-iscsi\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194453 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-lib-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194469 13046 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-sys\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.194569 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-sys\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.195153 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-locks-brick\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.195794 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-lib-modules\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.195852 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-run\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.195874 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-dev\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.197582 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-machine-id\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.197632 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-nvme\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.197672 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-locks-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.198262 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-combined-ca-bundle\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.198301 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-etc-iscsi\") pod \"cinder-e64dd-backup-0\" (UID: 
\"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.201799 master-0 kubenswrapper[13046]: I0308 03:38:54.198499 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/100689ad-dc43-494b-a7a2-f0351b969ab7-var-lib-cinder\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.214709 master-0 kubenswrapper[13046]: I0308 03:38:54.203599 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-config-data\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.214709 master-0 kubenswrapper[13046]: I0308 03:38:54.205128 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-scripts\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.214709 master-0 kubenswrapper[13046]: I0308 03:38:54.206138 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/100689ad-dc43-494b-a7a2-f0351b969ab7-config-data-custom\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.220727 master-0 kubenswrapper[13046]: I0308 03:38:54.220679 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77422f95-b335-44a2-a0f1-6bd0dcc99431" path="/var/lib/kubelet/pods/77422f95-b335-44a2-a0f1-6bd0dcc99431/volumes" Mar 08 03:38:54.237242 master-0 kubenswrapper[13046]: I0308 03:38:54.237203 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-726hz\" (UniqueName: \"kubernetes.io/projected/100689ad-dc43-494b-a7a2-f0351b969ab7-kube-api-access-726hz\") pod \"cinder-e64dd-backup-0\" (UID: \"100689ad-dc43-494b-a7a2-f0351b969ab7\") " pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.250944 master-0 kubenswrapper[13046]: I0308 03:38:54.250886 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerStarted","Data":"9f41fc0704051a2b5ec348df05613f4048ae72c15772a5ce959d055908fb8a27"} Mar 08 03:38:54.255510 master-0 kubenswrapper[13046]: I0308 03:38:54.252229 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:54.255510 master-0 kubenswrapper[13046]: I0308 03:38:54.254254 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" event={"ID":"bd0e3137-0784-440c-abba-948535a56e3b","Type":"ContainerStarted","Data":"37243e67ce22143bf85efe7fd5f36c02074109921420cfc347de057bc6d8b8f7"} Mar 08 03:38:54.257753 master-0 kubenswrapper[13046]: I0308 03:38:54.257714 13046 generic.go:334] "Generic (PLEG): container finished" podID="25cc77c3-42af-4d5e-bd93-a0d0c7e07092" containerID="3b8c5299ea1578665935913094985895a2d8c285e6a846a3d16693c0ad238a7b" exitCode=0 Mar 08 03:38:54.257838 master-0 kubenswrapper[13046]: I0308 03:38:54.257764 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7859765cb5-ddvmn" event={"ID":"25cc77c3-42af-4d5e-bd93-a0d0c7e07092","Type":"ContainerDied","Data":"3b8c5299ea1578665935913094985895a2d8c285e6a846a3d16693c0ad238a7b"} Mar 08 03:38:54.257906 master-0 kubenswrapper[13046]: I0308 03:38:54.257836 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7859765cb5-ddvmn" 
event={"ID":"25cc77c3-42af-4d5e-bd93-a0d0c7e07092","Type":"ContainerStarted","Data":"67980c41a4e55419b99c32d8d887b2c68d531e47a0b3e0cc254df9400deafca3"} Mar 08 03:38:54.259384 master-0 kubenswrapper[13046]: I0308 03:38:54.259346 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-scheduler-0" event={"ID":"7e69d8ee-8a68-41e2-8af4-af4d00ec9af2","Type":"ContainerStarted","Data":"9e91ed8c37e221c2d6f16329d694d82598748c7e3d01019be90112c11e9986c7"} Mar 08 03:38:54.298805 master-0 kubenswrapper[13046]: I0308 03:38:54.298242 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:38:54.306055 master-0 kubenswrapper[13046]: I0308 03:38:54.306012 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:54.320570 master-0 kubenswrapper[13046]: I0308 03:38:54.320191 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-7b5cd79955-pgwfv" podStartSLOduration=6.28734563 podStartE2EDuration="9.320171247s" podCreationTimestamp="2026-03-08 03:38:45 +0000 UTC" firstStartedPulling="2026-03-08 03:38:47.726735196 +0000 UTC m=+1529.805502413" lastFinishedPulling="2026-03-08 03:38:50.759560823 +0000 UTC m=+1532.838328030" observedRunningTime="2026-03-08 03:38:54.305321615 +0000 UTC m=+1536.384088832" watchObservedRunningTime="2026-03-08 03:38:54.320171247 +0000 UTC m=+1536.398938464" Mar 08 03:38:54.335507 master-0 kubenswrapper[13046]: I0308 03:38:54.335007 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:54.412507 master-0 kubenswrapper[13046]: I0308 03:38:54.412359 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" podStartSLOduration=5.412341966 podStartE2EDuration="5.412341966s" podCreationTimestamp="2026-03-08 03:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:54.346739292 +0000 UTC m=+1536.425506509" watchObservedRunningTime="2026-03-08 03:38:54.412341966 +0000 UTC m=+1536.491109183" Mar 08 03:38:54.416508 master-0 kubenswrapper[13046]: I0308 03:38:54.414360 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-scheduler-0" podStartSLOduration=6.414352003 podStartE2EDuration="6.414352003s" podCreationTimestamp="2026-03-08 03:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:54.384826454 +0000 UTC m=+1536.463593671" watchObservedRunningTime="2026-03-08 03:38:54.414352003 +0000 UTC m=+1536.493119220" Mar 08 03:38:54.425885 master-0 kubenswrapper[13046]: I0308 03:38:54.425849 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-e64dd-api-0" Mar 08 03:38:54.644671 master-0 kubenswrapper[13046]: I0308 03:38:54.636187 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:38:55.155770 master-0 kubenswrapper[13046]: I0308 03:38:55.153559 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e64dd-backup-0"] Mar 08 03:38:55.291558 master-0 kubenswrapper[13046]: I0308 03:38:55.291251 13046 generic.go:334] "Generic (PLEG): container finished" podID="ce85509f-e75e-477b-8797-4c405e53e3e3" 
containerID="9f41fc0704051a2b5ec348df05613f4048ae72c15772a5ce959d055908fb8a27" exitCode=1 Mar 08 03:38:55.291558 master-0 kubenswrapper[13046]: I0308 03:38:55.291331 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerDied","Data":"9f41fc0704051a2b5ec348df05613f4048ae72c15772a5ce959d055908fb8a27"} Mar 08 03:38:55.292889 master-0 kubenswrapper[13046]: I0308 03:38:55.292093 13046 scope.go:117] "RemoveContainer" containerID="9f41fc0704051a2b5ec348df05613f4048ae72c15772a5ce959d055908fb8a27" Mar 08 03:38:55.307025 master-0 kubenswrapper[13046]: I0308 03:38:55.306986 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7859765cb5-ddvmn" event={"ID":"25cc77c3-42af-4d5e-bd93-a0d0c7e07092","Type":"ContainerStarted","Data":"37229f284b89a60b40073eda9da447f782a1cbfced3719d51781da4d2d5c12cc"} Mar 08 03:38:55.307168 master-0 kubenswrapper[13046]: I0308 03:38:55.307155 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-7859765cb5-ddvmn" Mar 08 03:38:55.328845 master-0 kubenswrapper[13046]: I0308 03:38:55.328799 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"100689ad-dc43-494b-a7a2-f0351b969ab7","Type":"ContainerStarted","Data":"114e84ff296ccb97352f5d53459ee0f87642b40e9dc9b6bb0302829d920c9308"} Mar 08 03:38:55.366662 master-0 kubenswrapper[13046]: I0308 03:38:55.366575 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-7859765cb5-ddvmn" podStartSLOduration=5.36655477 podStartE2EDuration="5.36655477s" podCreationTimestamp="2026-03-08 03:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:55.366137208 +0000 UTC m=+1537.444904425" watchObservedRunningTime="2026-03-08 03:38:55.36655477 +0000 UTC 
m=+1537.445321987" Mar 08 03:38:56.347248 master-0 kubenswrapper[13046]: I0308 03:38:56.345817 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"100689ad-dc43-494b-a7a2-f0351b969ab7","Type":"ContainerStarted","Data":"ab33c02be9bbd5fdd7c1866053ef09d2499e5b01fc60016e932e71c323b3d22f"} Mar 08 03:38:56.347248 master-0 kubenswrapper[13046]: I0308 03:38:56.345870 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e64dd-backup-0" event={"ID":"100689ad-dc43-494b-a7a2-f0351b969ab7","Type":"ContainerStarted","Data":"1d44ce7fbc90f8479a7bb136d4d26fbde237bfc4c90d21ebbce2241ea1d00d2f"} Mar 08 03:38:56.355846 master-0 kubenswrapper[13046]: I0308 03:38:56.355770 13046 generic.go:334] "Generic (PLEG): container finished" podID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerID="9d6902e5534cc1b542633da8bb61b9430167151900890f311eb4c0dec1a71468" exitCode=1 Mar 08 03:38:56.356049 master-0 kubenswrapper[13046]: I0308 03:38:56.355853 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerDied","Data":"9d6902e5534cc1b542633da8bb61b9430167151900890f311eb4c0dec1a71468"} Mar 08 03:38:56.356049 master-0 kubenswrapper[13046]: I0308 03:38:56.355903 13046 scope.go:117] "RemoveContainer" containerID="9f41fc0704051a2b5ec348df05613f4048ae72c15772a5ce959d055908fb8a27" Mar 08 03:38:56.357173 master-0 kubenswrapper[13046]: I0308 03:38:56.356697 13046 scope.go:117] "RemoveContainer" containerID="9d6902e5534cc1b542633da8bb61b9430167151900890f311eb4c0dec1a71468" Mar 08 03:38:56.357173 master-0 kubenswrapper[13046]: E0308 03:38:56.356925 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7b5cd79955-pgwfv_openstack(ce85509f-e75e-477b-8797-4c405e53e3e3)\"" 
pod="openstack/ironic-7b5cd79955-pgwfv" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" Mar 08 03:38:56.362587 master-0 kubenswrapper[13046]: I0308 03:38:56.361429 13046 generic.go:334] "Generic (PLEG): container finished" podID="528b1064-a3b2-4ea4-8584-abeffdbedbbe" containerID="c98ac0725684694acc42cbb1735351d2f5635f95e00fb9b8a56f40a4cc2c5d6e" exitCode=0 Mar 08 03:38:56.362678 master-0 kubenswrapper[13046]: I0308 03:38:56.362600 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerDied","Data":"c98ac0725684694acc42cbb1735351d2f5635f95e00fb9b8a56f40a4cc2c5d6e"} Mar 08 03:38:56.411708 master-0 kubenswrapper[13046]: I0308 03:38:56.408957 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e64dd-backup-0" podStartSLOduration=3.408935248 podStartE2EDuration="3.408935248s" podCreationTimestamp="2026-03-08 03:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:38:56.379414589 +0000 UTC m=+1538.458181816" watchObservedRunningTime="2026-03-08 03:38:56.408935248 +0000 UTC m=+1538.487702455" Mar 08 03:38:56.419505 master-0 kubenswrapper[13046]: I0308 03:38:56.418566 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:56.419505 master-0 kubenswrapper[13046]: I0308 03:38:56.418643 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:38:56.572506 master-0 kubenswrapper[13046]: I0308 03:38:56.569829 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-db8dcf7d7-ct9xk" Mar 08 03:38:57.133094 master-0 kubenswrapper[13046]: I0308 03:38:57.133033 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:57.387513 master-0 kubenswrapper[13046]: I0308 03:38:57.387458 13046 scope.go:117] "RemoveContainer" containerID="9d6902e5534cc1b542633da8bb61b9430167151900890f311eb4c0dec1a71468" Mar 08 03:38:57.388580 master-0 kubenswrapper[13046]: E0308 03:38:57.388556 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7b5cd79955-pgwfv_openstack(ce85509f-e75e-477b-8797-4c405e53e3e3)\"" pod="openstack/ironic-7b5cd79955-pgwfv" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" Mar 08 03:38:57.486204 master-0 kubenswrapper[13046]: I0308 03:38:57.486145 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b6ffc768d-hm56p" Mar 08 03:38:57.645704 master-0 kubenswrapper[13046]: I0308 03:38:57.639402 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7bff74b894-bgwhf"] Mar 08 03:38:57.645704 master-0 kubenswrapper[13046]: I0308 03:38:57.640197 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7bff74b894-bgwhf" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerName="placement-log" containerID="cri-o://8daaacbc9e146992705f48a1491ae2830c1e0864097b1f9b8700acd2cd0f8061" gracePeriod=30 Mar 08 03:38:57.645704 master-0 kubenswrapper[13046]: I0308 03:38:57.640611 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7bff74b894-bgwhf" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerName="placement-api" containerID="cri-o://cc663a949dd9f36d6267702390a8d2a544cb5dcfd0ecc132d92141a48d2c6668" gracePeriod=30 Mar 08 03:38:57.786805 master-0 kubenswrapper[13046]: I0308 03:38:57.763542 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 03:38:57.786805 master-0 kubenswrapper[13046]: 
I0308 03:38:57.765083 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 03:38:57.789752 master-0 kubenswrapper[13046]: I0308 03:38:57.787565 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 03:38:57.789752 master-0 kubenswrapper[13046]: I0308 03:38:57.787793 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 03:38:57.815697 master-0 kubenswrapper[13046]: I0308 03:38:57.815095 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 03:38:57.893507 master-0 kubenswrapper[13046]: I0308 03:38:57.892448 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2736a83d-7185-4aad-af7a-9b36d243400d-openstack-config\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:57.893507 master-0 kubenswrapper[13046]: I0308 03:38:57.892535 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2736a83d-7185-4aad-af7a-9b36d243400d-openstack-config-secret\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:57.893507 master-0 kubenswrapper[13046]: I0308 03:38:57.892556 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d9ng\" (UniqueName: \"kubernetes.io/projected/2736a83d-7185-4aad-af7a-9b36d243400d-kube-api-access-5d9ng\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:57.893507 master-0 kubenswrapper[13046]: I0308 03:38:57.892662 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2736a83d-7185-4aad-af7a-9b36d243400d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:57.997801 master-0 kubenswrapper[13046]: I0308 03:38:57.997627 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2736a83d-7185-4aad-af7a-9b36d243400d-openstack-config\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:57.997801 master-0 kubenswrapper[13046]: I0308 03:38:57.997775 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2736a83d-7185-4aad-af7a-9b36d243400d-openstack-config-secret\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:57.997801 master-0 kubenswrapper[13046]: I0308 03:38:57.997801 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d9ng\" (UniqueName: \"kubernetes.io/projected/2736a83d-7185-4aad-af7a-9b36d243400d-kube-api-access-5d9ng\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:57.998059 master-0 kubenswrapper[13046]: I0308 03:38:57.997964 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2736a83d-7185-4aad-af7a-9b36d243400d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:58.006456 master-0 kubenswrapper[13046]: I0308 03:38:58.002219 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2736a83d-7185-4aad-af7a-9b36d243400d-openstack-config\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:58.006456 master-0 kubenswrapper[13046]: I0308 03:38:58.003869 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2736a83d-7185-4aad-af7a-9b36d243400d-openstack-config-secret\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:58.026508 master-0 kubenswrapper[13046]: I0308 03:38:58.021561 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2736a83d-7185-4aad-af7a-9b36d243400d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:58.030512 master-0 kubenswrapper[13046]: I0308 03:38:58.030252 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d9ng\" (UniqueName: \"kubernetes.io/projected/2736a83d-7185-4aad-af7a-9b36d243400d-kube-api-access-5d9ng\") pod \"openstackclient\" (UID: \"2736a83d-7185-4aad-af7a-9b36d243400d\") " pod="openstack/openstackclient" Mar 08 03:38:58.144192 master-0 kubenswrapper[13046]: I0308 03:38:58.144138 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 03:38:58.478229 master-0 kubenswrapper[13046]: I0308 03:38:58.478122 13046 generic.go:334] "Generic (PLEG): container finished" podID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerID="8daaacbc9e146992705f48a1491ae2830c1e0864097b1f9b8700acd2cd0f8061" exitCode=143 Mar 08 03:38:58.479195 master-0 kubenswrapper[13046]: I0308 03:38:58.479167 13046 scope.go:117] "RemoveContainer" containerID="9d6902e5534cc1b542633da8bb61b9430167151900890f311eb4c0dec1a71468" Mar 08 03:38:58.479493 master-0 kubenswrapper[13046]: E0308 03:38:58.479440 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7b5cd79955-pgwfv_openstack(ce85509f-e75e-477b-8797-4c405e53e3e3)\"" pod="openstack/ironic-7b5cd79955-pgwfv" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" Mar 08 03:38:58.479578 master-0 kubenswrapper[13046]: I0308 03:38:58.479509 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bff74b894-bgwhf" event={"ID":"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c","Type":"ContainerDied","Data":"8daaacbc9e146992705f48a1491ae2830c1e0864097b1f9b8700acd2cd0f8061"} Mar 08 03:38:58.553027 master-0 kubenswrapper[13046]: I0308 03:38:58.552909 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:38:58.760826 master-0 kubenswrapper[13046]: I0308 03:38:58.756274 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 03:38:59.336116 master-0 kubenswrapper[13046]: I0308 03:38:59.336076 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:38:59.502352 master-0 kubenswrapper[13046]: I0308 03:38:59.501903 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"2736a83d-7185-4aad-af7a-9b36d243400d","Type":"ContainerStarted","Data":"59c489a51d73e21162b73d4d951b50a023eeb7284cefecae096a417ed36911e5"} Mar 08 03:38:59.578238 master-0 kubenswrapper[13046]: I0308 03:38:59.578183 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-e64dd-scheduler-0" Mar 08 03:38:59.890898 master-0 kubenswrapper[13046]: I0308 03:38:59.890834 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-e64dd-volume-lvm-iscsi-0" Mar 08 03:39:00.415963 master-0 kubenswrapper[13046]: I0308 03:39:00.415913 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-rcvjv"] Mar 08 03:39:00.454605 master-0 kubenswrapper[13046]: I0308 03:39:00.419403 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.454605 master-0 kubenswrapper[13046]: I0308 03:39:00.424829 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 08 03:39:00.454605 master-0 kubenswrapper[13046]: I0308 03:39:00.425118 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 08 03:39:00.454605 master-0 kubenswrapper[13046]: I0308 03:39:00.436179 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-rcvjv"] Mar 08 03:39:00.531558 master-0 kubenswrapper[13046]: I0308 03:39:00.531496 13046 generic.go:334] "Generic (PLEG): container finished" podID="d093b57a-247f-4d76-8ad2-659f459f5f1a" containerID="4c2490c757bd4af990c59c840fc0cceb1cf1204f9198bcc6a46f24a8b703cd6d" exitCode=1 Mar 08 03:39:00.532035 master-0 kubenswrapper[13046]: I0308 03:39:00.531629 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" 
event={"ID":"d093b57a-247f-4d76-8ad2-659f459f5f1a","Type":"ContainerDied","Data":"4c2490c757bd4af990c59c840fc0cceb1cf1204f9198bcc6a46f24a8b703cd6d"} Mar 08 03:39:00.533326 master-0 kubenswrapper[13046]: I0308 03:39:00.533296 13046 scope.go:117] "RemoveContainer" containerID="4c2490c757bd4af990c59c840fc0cceb1cf1204f9198bcc6a46f24a8b703cd6d" Mar 08 03:39:00.538961 master-0 kubenswrapper[13046]: I0308 03:39:00.538903 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-scripts\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.539053 master-0 kubenswrapper[13046]: I0308 03:39:00.538981 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-combined-ca-bundle\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.539172 master-0 kubenswrapper[13046]: I0308 03:39:00.539143 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/327283ac-a839-484a-a4aa-daf30c72b9f4-etc-podinfo\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.539219 master-0 kubenswrapper[13046]: I0308 03:39:00.539205 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " 
pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.539267 master-0 kubenswrapper[13046]: I0308 03:39:00.539241 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rldcz\" (UniqueName: \"kubernetes.io/projected/327283ac-a839-484a-a4aa-daf30c72b9f4-kube-api-access-rldcz\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.539392 master-0 kubenswrapper[13046]: I0308 03:39:00.539359 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.539477 master-0 kubenswrapper[13046]: I0308 03:39:00.539456 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-config\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.642017 master-0 kubenswrapper[13046]: I0308 03:39:00.641981 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/327283ac-a839-484a-a4aa-daf30c72b9f4-etc-podinfo\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.642263 master-0 kubenswrapper[13046]: I0308 03:39:00.642243 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: 
\"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.642650 master-0 kubenswrapper[13046]: I0308 03:39:00.642633 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rldcz\" (UniqueName: \"kubernetes.io/projected/327283ac-a839-484a-a4aa-daf30c72b9f4-kube-api-access-rldcz\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.642824 master-0 kubenswrapper[13046]: I0308 03:39:00.642807 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.642971 master-0 kubenswrapper[13046]: I0308 03:39:00.642956 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-config\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.643089 master-0 kubenswrapper[13046]: I0308 03:39:00.643075 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-scripts\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.643179 master-0 kubenswrapper[13046]: I0308 03:39:00.643165 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-combined-ca-bundle\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.650813 master-0 kubenswrapper[13046]: I0308 03:39:00.650629 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.660534 master-0 kubenswrapper[13046]: I0308 03:39:00.658898 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/327283ac-a839-484a-a4aa-daf30c72b9f4-etc-podinfo\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.660534 master-0 kubenswrapper[13046]: I0308 03:39:00.660090 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.663000 master-0 kubenswrapper[13046]: I0308 03:39:00.662957 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-combined-ca-bundle\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.673718 master-0 
kubenswrapper[13046]: I0308 03:39:00.673388 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-scripts\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.675418 master-0 kubenswrapper[13046]: I0308 03:39:00.675386 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rldcz\" (UniqueName: \"kubernetes.io/projected/327283ac-a839-484a-a4aa-daf30c72b9f4-kube-api-access-rldcz\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.689644 master-0 kubenswrapper[13046]: I0308 03:39:00.689269 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-config\") pod \"ironic-inspector-db-sync-rcvjv\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.780747 master-0 kubenswrapper[13046]: I0308 03:39:00.780567 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:39:00.780747 master-0 kubenswrapper[13046]: I0308 03:39:00.780644 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:39:00.781242 master-0 kubenswrapper[13046]: I0308 03:39:00.781186 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:00.975434 master-0 kubenswrapper[13046]: I0308 03:39:00.974317 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:39:01.108576 master-0 kubenswrapper[13046]: I0308 03:39:01.102445 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fbcc86b7-tfsvc"] Mar 08 03:39:01.108576 master-0 kubenswrapper[13046]: I0308 03:39:01.102789 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" podUID="93dab66c-116f-4bae-8331-aad21f0e3232" containerName="dnsmasq-dns" containerID="cri-o://a0e23c53745b817ec21b0155dce83f4beb43b47f0c72ddc93c1730ccb9295cff" gracePeriod=10 Mar 08 03:39:02.374119 master-0 kubenswrapper[13046]: I0308 03:39:02.374047 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d8564b49-5mpt2" Mar 08 03:39:02.482595 master-0 kubenswrapper[13046]: I0308 03:39:02.482459 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-7859765cb5-ddvmn" Mar 08 03:39:02.519265 master-0 kubenswrapper[13046]: I0308 03:39:02.519198 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67766cb894-9qmph"] Mar 08 03:39:02.519605 master-0 kubenswrapper[13046]: I0308 03:39:02.519523 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67766cb894-9qmph" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" containerName="neutron-api" containerID="cri-o://4f9d048d579f237ee3fe87b15b98098d4dd4101762f67f7139a1ec5c7df66f1d" gracePeriod=30 Mar 08 03:39:02.519605 master-0 kubenswrapper[13046]: I0308 03:39:02.519569 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67766cb894-9qmph" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" 
containerName="neutron-httpd" containerID="cri-o://170e168489adeb31e1817b93124fb91b272d50f96b75a47b83a8bfefbe03502b" gracePeriod=30 Mar 08 03:39:02.753556 master-0 kubenswrapper[13046]: I0308 03:39:02.753317 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-7b5cd79955-pgwfv"] Mar 08 03:39:02.754574 master-0 kubenswrapper[13046]: I0308 03:39:02.753850 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-7b5cd79955-pgwfv" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api-log" containerID="cri-o://a5b6711eff6dd2e62e4f814e13dab71a094483239358ebda77bba8ef86ca5e62" gracePeriod=60 Mar 08 03:39:03.793198 master-0 kubenswrapper[13046]: I0308 03:39:03.782772 13046 generic.go:334] "Generic (PLEG): container finished" podID="f3134df2-c86a-46bb-89ca-3598293b4695" containerID="170e168489adeb31e1817b93124fb91b272d50f96b75a47b83a8bfefbe03502b" exitCode=0 Mar 08 03:39:03.793198 master-0 kubenswrapper[13046]: I0308 03:39:03.782869 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67766cb894-9qmph" event={"ID":"f3134df2-c86a-46bb-89ca-3598293b4695","Type":"ContainerDied","Data":"170e168489adeb31e1817b93124fb91b272d50f96b75a47b83a8bfefbe03502b"} Mar 08 03:39:03.799717 master-0 kubenswrapper[13046]: I0308 03:39:03.799622 13046 generic.go:334] "Generic (PLEG): container finished" podID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerID="a5b6711eff6dd2e62e4f814e13dab71a094483239358ebda77bba8ef86ca5e62" exitCode=143 Mar 08 03:39:03.799793 master-0 kubenswrapper[13046]: I0308 03:39:03.799752 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerDied","Data":"a5b6711eff6dd2e62e4f814e13dab71a094483239358ebda77bba8ef86ca5e62"} Mar 08 03:39:03.804031 master-0 kubenswrapper[13046]: I0308 03:39:03.803983 13046 generic.go:334] "Generic (PLEG): container finished" 
podID="93dab66c-116f-4bae-8331-aad21f0e3232" containerID="a0e23c53745b817ec21b0155dce83f4beb43b47f0c72ddc93c1730ccb9295cff" exitCode=0 Mar 08 03:39:03.804100 master-0 kubenswrapper[13046]: I0308 03:39:03.804064 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" event={"ID":"93dab66c-116f-4bae-8331-aad21f0e3232","Type":"ContainerDied","Data":"a0e23c53745b817ec21b0155dce83f4beb43b47f0c72ddc93c1730ccb9295cff"} Mar 08 03:39:03.808895 master-0 kubenswrapper[13046]: I0308 03:39:03.807126 13046 generic.go:334] "Generic (PLEG): container finished" podID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerID="cc663a949dd9f36d6267702390a8d2a544cb5dcfd0ecc132d92141a48d2c6668" exitCode=0 Mar 08 03:39:03.808895 master-0 kubenswrapper[13046]: I0308 03:39:03.807181 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bff74b894-bgwhf" event={"ID":"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c","Type":"ContainerDied","Data":"cc663a949dd9f36d6267702390a8d2a544cb5dcfd0ecc132d92141a48d2c6668"} Mar 08 03:39:04.433729 master-0 kubenswrapper[13046]: I0308 03:39:04.433671 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:39:04.477220 master-0 kubenswrapper[13046]: I0308 03:39:04.476988 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-rcvjv"] Mar 08 03:39:04.513728 master-0 kubenswrapper[13046]: I0308 03:39:04.513688 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:39:04.542619 master-0 kubenswrapper[13046]: I0308 03:39:04.542432 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:39:04.606369 master-0 kubenswrapper[13046]: I0308 03:39:04.602659 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-nb\") pod \"93dab66c-116f-4bae-8331-aad21f0e3232\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " Mar 08 03:39:04.606369 master-0 kubenswrapper[13046]: I0308 03:39:04.603035 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-svc\") pod \"93dab66c-116f-4bae-8331-aad21f0e3232\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " Mar 08 03:39:04.606369 master-0 kubenswrapper[13046]: I0308 03:39:04.603106 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-sb\") pod \"93dab66c-116f-4bae-8331-aad21f0e3232\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " Mar 08 03:39:04.606369 master-0 kubenswrapper[13046]: I0308 03:39:04.603216 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-config\") pod \"93dab66c-116f-4bae-8331-aad21f0e3232\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " Mar 08 03:39:04.606369 master-0 kubenswrapper[13046]: I0308 03:39:04.603353 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m64bq\" (UniqueName: \"kubernetes.io/projected/93dab66c-116f-4bae-8331-aad21f0e3232-kube-api-access-m64bq\") pod \"93dab66c-116f-4bae-8331-aad21f0e3232\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " Mar 08 03:39:04.606369 master-0 kubenswrapper[13046]: I0308 03:39:04.603706 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-swift-storage-0\") pod \"93dab66c-116f-4bae-8331-aad21f0e3232\" (UID: \"93dab66c-116f-4bae-8331-aad21f0e3232\") " Mar 08 03:39:04.623345 master-0 kubenswrapper[13046]: I0308 03:39:04.620111 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93dab66c-116f-4bae-8331-aad21f0e3232-kube-api-access-m64bq" (OuterVolumeSpecName: "kube-api-access-m64bq") pod "93dab66c-116f-4bae-8331-aad21f0e3232" (UID: "93dab66c-116f-4bae-8331-aad21f0e3232"). InnerVolumeSpecName "kube-api-access-m64bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:39:04.676941 master-0 kubenswrapper[13046]: I0308 03:39:04.676886 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93dab66c-116f-4bae-8331-aad21f0e3232" (UID: "93dab66c-116f-4bae-8331-aad21f0e3232"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:04.681527 master-0 kubenswrapper[13046]: I0308 03:39:04.681309 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93dab66c-116f-4bae-8331-aad21f0e3232" (UID: "93dab66c-116f-4bae-8331-aad21f0e3232"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:04.714746 master-0 kubenswrapper[13046]: I0308 03:39:04.713899 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-logs\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.714746 master-0 kubenswrapper[13046]: I0308 03:39:04.713994 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-scripts\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.714746 master-0 kubenswrapper[13046]: I0308 03:39:04.714094 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-combined-ca-bundle\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.714746 master-0 kubenswrapper[13046]: I0308 03:39:04.714722 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-logs" (OuterVolumeSpecName: "logs") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:04.723094 master-0 kubenswrapper[13046]: I0308 03:39:04.722754 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-merged\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.723094 master-0 kubenswrapper[13046]: I0308 03:39:04.722909 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-public-tls-certs\") pod \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " Mar 08 03:39:04.723094 master-0 kubenswrapper[13046]: I0308 03:39:04.722937 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5n9f\" (UniqueName: \"kubernetes.io/projected/ce85509f-e75e-477b-8797-4c405e53e3e3-kube-api-access-x5n9f\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.723094 master-0 kubenswrapper[13046]: I0308 03:39:04.722963 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-logs\") pod \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " Mar 08 03:39:04.723094 master-0 kubenswrapper[13046]: I0308 03:39:04.723059 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-combined-ca-bundle\") pod \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " Mar 08 03:39:04.723094 master-0 kubenswrapper[13046]: I0308 03:39:04.723098 13046 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.723430 master-0 kubenswrapper[13046]: I0308 03:39:04.723156 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce85509f-e75e-477b-8797-4c405e53e3e3-etc-podinfo\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.723430 master-0 kubenswrapper[13046]: I0308 03:39:04.723209 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-internal-tls-certs\") pod \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " Mar 08 03:39:04.723430 master-0 kubenswrapper[13046]: I0308 03:39:04.723226 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-scripts\") pod \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " Mar 08 03:39:04.723430 master-0 kubenswrapper[13046]: I0308 03:39:04.723258 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w9f4\" (UniqueName: \"kubernetes.io/projected/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-kube-api-access-8w9f4\") pod \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " Mar 08 03:39:04.723430 master-0 kubenswrapper[13046]: I0308 03:39:04.723298 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-custom\") pod \"ce85509f-e75e-477b-8797-4c405e53e3e3\" (UID: \"ce85509f-e75e-477b-8797-4c405e53e3e3\") " Mar 08 03:39:04.723430 master-0 kubenswrapper[13046]: I0308 03:39:04.723329 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-config-data\") pod \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\" (UID: \"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c\") " Mar 08 03:39:04.728515 master-0 kubenswrapper[13046]: I0308 03:39:04.728193 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-logs" (OuterVolumeSpecName: "logs") pod "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" (UID: "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:04.729097 master-0 kubenswrapper[13046]: I0308 03:39:04.729059 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m64bq\" (UniqueName: \"kubernetes.io/projected/93dab66c-116f-4bae-8331-aad21f0e3232-kube-api-access-m64bq\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.729154 master-0 kubenswrapper[13046]: I0308 03:39:04.729099 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.729154 master-0 kubenswrapper[13046]: I0308 03:39:04.729112 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.729154 master-0 kubenswrapper[13046]: I0308 03:39:04.729122 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.729154 master-0 kubenswrapper[13046]: I0308 03:39:04.729131 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.732471 master-0 kubenswrapper[13046]: I0308 03:39:04.731973 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:04.732471 master-0 kubenswrapper[13046]: I0308 03:39:04.732311 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93dab66c-116f-4bae-8331-aad21f0e3232" (UID: "93dab66c-116f-4bae-8331-aad21f0e3232"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:04.745322 master-0 kubenswrapper[13046]: I0308 03:39:04.737764 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce85509f-e75e-477b-8797-4c405e53e3e3-kube-api-access-x5n9f" (OuterVolumeSpecName: "kube-api-access-x5n9f") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "kube-api-access-x5n9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:39:04.747905 master-0 kubenswrapper[13046]: I0308 03:39:04.747846 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-e64dd-backup-0" Mar 08 03:39:04.752674 master-0 kubenswrapper[13046]: I0308 03:39:04.752603 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:04.756842 master-0 kubenswrapper[13046]: I0308 03:39:04.756704 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ce85509f-e75e-477b-8797-4c405e53e3e3-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 03:39:04.759170 master-0 kubenswrapper[13046]: I0308 03:39:04.757939 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-scripts" (OuterVolumeSpecName: "scripts") pod "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" (UID: "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:04.772498 master-0 kubenswrapper[13046]: I0308 03:39:04.765635 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-scripts" (OuterVolumeSpecName: "scripts") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:04.772498 master-0 kubenswrapper[13046]: I0308 03:39:04.767680 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-config" (OuterVolumeSpecName: "config") pod "93dab66c-116f-4bae-8331-aad21f0e3232" (UID: "93dab66c-116f-4bae-8331-aad21f0e3232"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:04.783717 master-0 kubenswrapper[13046]: I0308 03:39:04.779709 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-kube-api-access-8w9f4" (OuterVolumeSpecName: "kube-api-access-8w9f4") pod "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" (UID: "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c"). InnerVolumeSpecName "kube-api-access-8w9f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:39:04.805504 master-0 kubenswrapper[13046]: I0308 03:39:04.804639 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93dab66c-116f-4bae-8331-aad21f0e3232" (UID: "93dab66c-116f-4bae-8331-aad21f0e3232"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.831693 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data" (OuterVolumeSpecName: "config-data") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832095 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832121 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832131 13046 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ce85509f-e75e-477b-8797-4c405e53e3e3-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832140 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832151 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w9f4\" (UniqueName: \"kubernetes.io/projected/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-kube-api-access-8w9f4\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832160 13046 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832170 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832203 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832212 13046 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ce85509f-e75e-477b-8797-4c405e53e3e3-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832221 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5n9f\" (UniqueName: \"kubernetes.io/projected/ce85509f-e75e-477b-8797-4c405e53e3e3-kube-api-access-x5n9f\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.833544 master-0 kubenswrapper[13046]: I0308 03:39:04.832230 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93dab66c-116f-4bae-8331-aad21f0e3232-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.837056 master-0 kubenswrapper[13046]: I0308 03:39:04.836105 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce85509f-e75e-477b-8797-4c405e53e3e3" (UID: "ce85509f-e75e-477b-8797-4c405e53e3e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:04.854349 master-0 kubenswrapper[13046]: I0308 03:39:04.853456 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" (UID: "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:04.859430 master-0 kubenswrapper[13046]: I0308 03:39:04.857369 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bff74b894-bgwhf" event={"ID":"1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c","Type":"ContainerDied","Data":"0a43a375968f9a9908aa22ec6c9fbc2395fa319d73137232ba8081851785c5f1"} Mar 08 03:39:04.859430 master-0 kubenswrapper[13046]: I0308 03:39:04.857428 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7bff74b894-bgwhf" Mar 08 03:39:04.859430 master-0 kubenswrapper[13046]: I0308 03:39:04.857437 13046 scope.go:117] "RemoveContainer" containerID="cc663a949dd9f36d6267702390a8d2a544cb5dcfd0ecc132d92141a48d2c6668" Mar 08 03:39:04.869073 master-0 kubenswrapper[13046]: I0308 03:39:04.861639 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rcvjv" event={"ID":"327283ac-a839-484a-a4aa-daf30c72b9f4","Type":"ContainerStarted","Data":"edc2c1f7706e71a434e13c55067567281ca679bdda51f07085c9a3275192c957"} Mar 08 03:39:04.869073 master-0 kubenswrapper[13046]: I0308 03:39:04.868274 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7b5cd79955-pgwfv" event={"ID":"ce85509f-e75e-477b-8797-4c405e53e3e3","Type":"ContainerDied","Data":"c1c20091a85971b691c2a3c64590385fb4d5e3b6312b9c304e791eeac6867806"} Mar 08 03:39:04.869073 master-0 kubenswrapper[13046]: I0308 03:39:04.868324 13046 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ironic-7b5cd79955-pgwfv" Mar 08 03:39:04.877655 master-0 kubenswrapper[13046]: I0308 03:39:04.877580 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" Mar 08 03:39:04.878394 master-0 kubenswrapper[13046]: I0308 03:39:04.878363 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fbcc86b7-tfsvc" event={"ID":"93dab66c-116f-4bae-8331-aad21f0e3232","Type":"ContainerDied","Data":"96fac1dd612f94e3e245fbe9f6ea6c81f8f9a1375756073cf827277782922d7f"} Mar 08 03:39:04.882120 master-0 kubenswrapper[13046]: I0308 03:39:04.882070 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" event={"ID":"d093b57a-247f-4d76-8ad2-659f459f5f1a","Type":"ContainerStarted","Data":"f5b66ecee10aa226dda3d240a3ade7c8d45753a9479e69dc2ec46a736fc65d20"} Mar 08 03:39:04.882632 master-0 kubenswrapper[13046]: I0308 03:39:04.882518 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:39:04.921755 master-0 kubenswrapper[13046]: I0308 03:39:04.921526 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-config-data" (OuterVolumeSpecName: "config-data") pod "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" (UID: "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:04.942523 master-0 kubenswrapper[13046]: I0308 03:39:04.942460 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.942523 master-0 kubenswrapper[13046]: I0308 03:39:04.942514 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce85509f-e75e-477b-8797-4c405e53e3e3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.942523 master-0 kubenswrapper[13046]: I0308 03:39:04.942525 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:04.958313 master-0 kubenswrapper[13046]: I0308 03:39:04.958234 13046 scope.go:117] "RemoveContainer" containerID="8daaacbc9e146992705f48a1491ae2830c1e0864097b1f9b8700acd2cd0f8061" Mar 08 03:39:04.999391 master-0 kubenswrapper[13046]: I0308 03:39:04.999303 13046 scope.go:117] "RemoveContainer" containerID="9d6902e5534cc1b542633da8bb61b9430167151900890f311eb4c0dec1a71468" Mar 08 03:39:05.002344 master-0 kubenswrapper[13046]: I0308 03:39:05.002302 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" (UID: "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:05.014399 master-0 kubenswrapper[13046]: I0308 03:39:05.013763 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" (UID: "1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:05.044470 master-0 kubenswrapper[13046]: I0308 03:39:05.044415 13046 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:05.044470 master-0 kubenswrapper[13046]: I0308 03:39:05.044462 13046 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:05.050399 master-0 kubenswrapper[13046]: I0308 03:39:05.050339 13046 scope.go:117] "RemoveContainer" containerID="a5b6711eff6dd2e62e4f814e13dab71a094483239358ebda77bba8ef86ca5e62" Mar 08 03:39:05.072768 master-0 kubenswrapper[13046]: I0308 03:39:05.072556 13046 scope.go:117] "RemoveContainer" containerID="2b0f1164d47af747ce565f753aebedb8270b64d5394e72ef1a65378ef3976cbf" Mar 08 03:39:05.148842 master-0 kubenswrapper[13046]: I0308 03:39:05.139778 13046 scope.go:117] "RemoveContainer" containerID="a0e23c53745b817ec21b0155dce83f4beb43b47f0c72ddc93c1730ccb9295cff" Mar 08 03:39:05.249700 master-0 kubenswrapper[13046]: I0308 03:39:05.249649 13046 scope.go:117] "RemoveContainer" containerID="755e79d4c0de53579ed2b528e4569e56f490113a2394552739a16926fb9a4f93" Mar 08 03:39:05.406581 master-0 kubenswrapper[13046]: I0308 03:39:05.405513 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-59fbcc86b7-tfsvc"] Mar 08 03:39:05.447578 master-0 kubenswrapper[13046]: I0308 03:39:05.447407 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59fbcc86b7-tfsvc"] Mar 08 03:39:05.459663 master-0 kubenswrapper[13046]: I0308 03:39:05.459609 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-7b5cd79955-pgwfv"] Mar 08 03:39:05.470697 master-0 kubenswrapper[13046]: I0308 03:39:05.470637 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-7b5cd79955-pgwfv"] Mar 08 03:39:05.482061 master-0 kubenswrapper[13046]: I0308 03:39:05.481984 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7bff74b894-bgwhf"] Mar 08 03:39:05.492796 master-0 kubenswrapper[13046]: I0308 03:39:05.492734 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7bff74b894-bgwhf"] Mar 08 03:39:06.149994 master-0 kubenswrapper[13046]: I0308 03:39:06.149921 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" path="/var/lib/kubelet/pods/1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c/volumes" Mar 08 03:39:06.152529 master-0 kubenswrapper[13046]: I0308 03:39:06.152500 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93dab66c-116f-4bae-8331-aad21f0e3232" path="/var/lib/kubelet/pods/93dab66c-116f-4bae-8331-aad21f0e3232/volumes" Mar 08 03:39:06.153345 master-0 kubenswrapper[13046]: I0308 03:39:06.153322 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" path="/var/lib/kubelet/pods/ce85509f-e75e-477b-8797-4c405e53e3e3/volumes" Mar 08 03:39:06.194217 master-0 kubenswrapper[13046]: I0308 03:39:06.194146 13046 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6f99839b-bfe0-4190-824a-67227752928b"] err="unable to destroy cgroup paths for cgroup 
[kubepods besteffort pod6f99839b-bfe0-4190-824a-67227752928b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6f99839b_bfe0_4190_824a_67227752928b.slice" Mar 08 03:39:06.194217 master-0 kubenswrapper[13046]: E0308 03:39:06.194218 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod6f99839b-bfe0-4190-824a-67227752928b] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod6f99839b-bfe0-4190-824a-67227752928b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6f99839b_bfe0_4190_824a_67227752928b.slice" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" podUID="6f99839b-bfe0-4190-824a-67227752928b" Mar 08 03:39:06.389887 master-0 kubenswrapper[13046]: I0308 03:39:06.389636 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6x7dc"] Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: E0308 03:39:06.390197 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390217 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: E0308 03:39:06.390233 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerName="placement-api" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390240 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerName="placement-api" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: E0308 03:39:06.390259 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerName="placement-log" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: 
I0308 03:39:06.390265 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerName="placement-log" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: E0308 03:39:06.390297 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api-log" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390303 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api-log" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: E0308 03:39:06.390323 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93dab66c-116f-4bae-8331-aad21f0e3232" containerName="dnsmasq-dns" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390329 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="93dab66c-116f-4bae-8331-aad21f0e3232" containerName="dnsmasq-dns" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: E0308 03:39:06.390340 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="init" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390345 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="init" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: E0308 03:39:06.390367 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93dab66c-116f-4bae-8331-aad21f0e3232" containerName="init" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390372 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="93dab66c-116f-4bae-8331-aad21f0e3232" containerName="init" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390581 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" 
containerName="placement-log" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390603 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390622 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api-log" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390642 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a578f6f-9e06-42a0-b8c9-61f05d7a5c0c" containerName="placement-api" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.390658 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="93dab66c-116f-4bae-8331-aad21f0e3232" containerName="dnsmasq-dns" Mar 08 03:39:06.395326 master-0 kubenswrapper[13046]: I0308 03:39:06.391383 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.421580 master-0 kubenswrapper[13046]: I0308 03:39:06.418667 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6x7dc"] Mar 08 03:39:06.491506 master-0 kubenswrapper[13046]: I0308 03:39:06.489411 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pdtcb"] Mar 08 03:39:06.491506 master-0 kubenswrapper[13046]: E0308 03:39:06.490002 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api" Mar 08 03:39:06.491506 master-0 kubenswrapper[13046]: I0308 03:39:06.490017 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api" Mar 08 03:39:06.491506 master-0 kubenswrapper[13046]: I0308 03:39:06.490752 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce85509f-e75e-477b-8797-4c405e53e3e3" containerName="ironic-api" Mar 08 03:39:06.491852 master-0 kubenswrapper[13046]: I0308 03:39:06.491829 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.517377 master-0 kubenswrapper[13046]: I0308 03:39:06.517316 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db81925e-2987-49d2-a511-33a9f40ddd8c-operator-scripts\") pod \"nova-api-db-create-6x7dc\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") " pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.517677 master-0 kubenswrapper[13046]: I0308 03:39:06.517653 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72vc\" (UniqueName: \"kubernetes.io/projected/db81925e-2987-49d2-a511-33a9f40ddd8c-kube-api-access-d72vc\") pod \"nova-api-db-create-6x7dc\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") " pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.518249 master-0 kubenswrapper[13046]: I0308 03:39:06.518207 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pdtcb"] Mar 08 03:39:06.624021 master-0 kubenswrapper[13046]: I0308 03:39:06.620422 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-operator-scripts\") pod \"nova-cell0-db-create-pdtcb\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") " pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.624021 master-0 kubenswrapper[13046]: I0308 03:39:06.620545 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db81925e-2987-49d2-a511-33a9f40ddd8c-operator-scripts\") pod \"nova-api-db-create-6x7dc\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") " pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.624021 master-0 kubenswrapper[13046]: I0308 03:39:06.620683 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lc88\" (UniqueName: \"kubernetes.io/projected/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-kube-api-access-6lc88\") pod \"nova-cell0-db-create-pdtcb\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") " pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.624021 master-0 kubenswrapper[13046]: I0308 03:39:06.620717 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72vc\" (UniqueName: \"kubernetes.io/projected/db81925e-2987-49d2-a511-33a9f40ddd8c-kube-api-access-d72vc\") pod \"nova-api-db-create-6x7dc\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") " pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.624021 master-0 kubenswrapper[13046]: I0308 03:39:06.621715 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db81925e-2987-49d2-a511-33a9f40ddd8c-operator-scripts\") pod \"nova-api-db-create-6x7dc\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") " pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.652441 master-0 kubenswrapper[13046]: I0308 03:39:06.652345 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72vc\" (UniqueName: \"kubernetes.io/projected/db81925e-2987-49d2-a511-33a9f40ddd8c-kube-api-access-d72vc\") pod \"nova-api-db-create-6x7dc\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") " pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.656810 master-0 kubenswrapper[13046]: I0308 03:39:06.656765 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-xds8t"] Mar 08 03:39:06.659296 master-0 kubenswrapper[13046]: I0308 03:39:06.659255 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:06.702723 master-0 kubenswrapper[13046]: I0308 03:39:06.702419 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-65fa-account-create-update-rktss"] Mar 08 03:39:06.705434 master-0 kubenswrapper[13046]: I0308 03:39:06.705406 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 03:39:06.709021 master-0 kubenswrapper[13046]: I0308 03:39:06.708986 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 08 03:39:06.722518 master-0 kubenswrapper[13046]: I0308 03:39:06.722428 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-operator-scripts\") pod \"nova-cell1-db-create-xds8t\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") " pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:06.722632 master-0 kubenswrapper[13046]: I0308 03:39:06.722609 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lc88\" (UniqueName: \"kubernetes.io/projected/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-kube-api-access-6lc88\") pod \"nova-cell0-db-create-pdtcb\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") " pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.722715 master-0 kubenswrapper[13046]: I0308 03:39:06.722696 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-operator-scripts\") pod \"nova-cell0-db-create-pdtcb\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") " pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.722783 master-0 kubenswrapper[13046]: I0308 03:39:06.722720 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8zfh\" (UniqueName: \"kubernetes.io/projected/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-kube-api-access-n8zfh\") pod \"nova-cell1-db-create-xds8t\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") " pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:06.724514 master-0 kubenswrapper[13046]: I0308 03:39:06.724356 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-operator-scripts\") pod \"nova-cell0-db-create-pdtcb\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") " pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.731968 master-0 kubenswrapper[13046]: I0308 03:39:06.731929 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xds8t"] Mar 08 03:39:06.733913 master-0 kubenswrapper[13046]: I0308 03:39:06.733881 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6x7dc" Mar 08 03:39:06.761526 master-0 kubenswrapper[13046]: I0308 03:39:06.760195 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-65fa-account-create-update-rktss"] Mar 08 03:39:06.775641 master-0 kubenswrapper[13046]: I0308 03:39:06.775496 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lc88\" (UniqueName: \"kubernetes.io/projected/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-kube-api-access-6lc88\") pod \"nova-cell0-db-create-pdtcb\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") " pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.826506 master-0 kubenswrapper[13046]: I0308 03:39:06.824976 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-operator-scripts\") pod \"nova-cell1-db-create-xds8t\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") " pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:06.826506 master-0 kubenswrapper[13046]: I0308 03:39:06.825096 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fmvc\" (UniqueName: \"kubernetes.io/projected/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-kube-api-access-6fmvc\") pod \"nova-api-65fa-account-create-update-rktss\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") " pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 03:39:06.826506 master-0 kubenswrapper[13046]: I0308 03:39:06.825281 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-operator-scripts\") pod \"nova-api-65fa-account-create-update-rktss\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") " pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 
03:39:06.826506 master-0 kubenswrapper[13046]: I0308 03:39:06.825341 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8zfh\" (UniqueName: \"kubernetes.io/projected/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-kube-api-access-n8zfh\") pod \"nova-cell1-db-create-xds8t\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") " pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:06.826506 master-0 kubenswrapper[13046]: I0308 03:39:06.826387 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-operator-scripts\") pod \"nova-cell1-db-create-xds8t\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") " pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:06.841256 master-0 kubenswrapper[13046]: I0308 03:39:06.841198 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2420-account-create-update-mb88p"] Mar 08 03:39:06.846028 master-0 kubenswrapper[13046]: I0308 03:39:06.845124 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pdtcb" Mar 08 03:39:06.862252 master-0 kubenswrapper[13046]: I0308 03:39:06.862098 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2420-account-create-update-mb88p"] Mar 08 03:39:06.862534 master-0 kubenswrapper[13046]: I0308 03:39:06.862494 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:06.864937 master-0 kubenswrapper[13046]: I0308 03:39:06.864914 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 03:39:06.867413 master-0 kubenswrapper[13046]: I0308 03:39:06.867386 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8zfh\" (UniqueName: \"kubernetes.io/projected/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-kube-api-access-n8zfh\") pod \"nova-cell1-db-create-xds8t\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") " pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:06.928700 master-0 kubenswrapper[13046]: I0308 03:39:06.928653 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f3459cf-9e53-466c-906d-6a9033b782f1-operator-scripts\") pod \"nova-cell0-2420-account-create-update-mb88p\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") " pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:06.932694 master-0 kubenswrapper[13046]: I0308 03:39:06.928716 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-operator-scripts\") pod \"nova-api-65fa-account-create-update-rktss\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") " pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 03:39:06.932694 master-0 kubenswrapper[13046]: I0308 03:39:06.928800 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448bw\" (UniqueName: \"kubernetes.io/projected/2f3459cf-9e53-466c-906d-6a9033b782f1-kube-api-access-448bw\") pod \"nova-cell0-2420-account-create-update-mb88p\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") " 
pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:06.932694 master-0 kubenswrapper[13046]: I0308 03:39:06.928849 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fmvc\" (UniqueName: \"kubernetes.io/projected/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-kube-api-access-6fmvc\") pod \"nova-api-65fa-account-create-update-rktss\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") " pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 03:39:06.932694 master-0 kubenswrapper[13046]: I0308 03:39:06.929362 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-operator-scripts\") pod \"nova-api-65fa-account-create-update-rktss\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") " pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 03:39:06.976849 master-0 kubenswrapper[13046]: I0308 03:39:06.970428 13046 generic.go:334] "Generic (PLEG): container finished" podID="f3134df2-c86a-46bb-89ca-3598293b4695" containerID="4f9d048d579f237ee3fe87b15b98098d4dd4101762f67f7139a1ec5c7df66f1d" exitCode=0 Mar 08 03:39:06.976849 master-0 kubenswrapper[13046]: I0308 03:39:06.970541 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9f86777f9-xlvvs" Mar 08 03:39:06.976849 master-0 kubenswrapper[13046]: I0308 03:39:06.972050 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67766cb894-9qmph" event={"ID":"f3134df2-c86a-46bb-89ca-3598293b4695","Type":"ContainerDied","Data":"4f9d048d579f237ee3fe87b15b98098d4dd4101762f67f7139a1ec5c7df66f1d"} Mar 08 03:39:06.976849 master-0 kubenswrapper[13046]: I0308 03:39:06.973576 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fmvc\" (UniqueName: \"kubernetes.io/projected/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-kube-api-access-6fmvc\") pod \"nova-api-65fa-account-create-update-rktss\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") " pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 03:39:06.984227 master-0 kubenswrapper[13046]: I0308 03:39:06.984165 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4cfc-account-create-update-sf2rb"] Mar 08 03:39:07.005139 master-0 kubenswrapper[13046]: I0308 03:39:06.991998 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.005139 master-0 kubenswrapper[13046]: I0308 03:39:06.994564 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 03:39:07.005139 master-0 kubenswrapper[13046]: I0308 03:39:07.000402 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xds8t" Mar 08 03:39:07.017413 master-0 kubenswrapper[13046]: I0308 03:39:07.016656 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4cfc-account-create-update-sf2rb"] Mar 08 03:39:07.031177 master-0 kubenswrapper[13046]: I0308 03:39:07.026369 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-65fa-account-create-update-rktss" Mar 08 03:39:07.033173 master-0 kubenswrapper[13046]: I0308 03:39:07.033130 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svhtn\" (UniqueName: \"kubernetes.io/projected/38bc77fe-5af2-4fe1-b7d5-321250be828e-kube-api-access-svhtn\") pod \"nova-cell1-4cfc-account-create-update-sf2rb\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") " pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.033262 master-0 kubenswrapper[13046]: I0308 03:39:07.033210 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f3459cf-9e53-466c-906d-6a9033b782f1-operator-scripts\") pod \"nova-cell0-2420-account-create-update-mb88p\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") " pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:07.033988 master-0 kubenswrapper[13046]: I0308 03:39:07.033957 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f3459cf-9e53-466c-906d-6a9033b782f1-operator-scripts\") pod \"nova-cell0-2420-account-create-update-mb88p\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") " pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:07.037753 master-0 kubenswrapper[13046]: I0308 03:39:07.037704 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448bw\" (UniqueName: \"kubernetes.io/projected/2f3459cf-9e53-466c-906d-6a9033b782f1-kube-api-access-448bw\") pod \"nova-cell0-2420-account-create-update-mb88p\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") " pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:07.037829 master-0 kubenswrapper[13046]: I0308 03:39:07.037786 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bc77fe-5af2-4fe1-b7d5-321250be828e-operator-scripts\") pod \"nova-cell1-4cfc-account-create-update-sf2rb\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") " pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.054325 master-0 kubenswrapper[13046]: I0308 03:39:07.054292 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448bw\" (UniqueName: \"kubernetes.io/projected/2f3459cf-9e53-466c-906d-6a9033b782f1-kube-api-access-448bw\") pod \"nova-cell0-2420-account-create-update-mb88p\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") " pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:07.063677 master-0 kubenswrapper[13046]: I0308 03:39:07.063607 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f86777f9-xlvvs"] Mar 08 03:39:07.073643 master-0 kubenswrapper[13046]: I0308 03:39:07.072685 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9f86777f9-xlvvs"] Mar 08 03:39:07.141351 master-0 kubenswrapper[13046]: I0308 03:39:07.140291 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bc77fe-5af2-4fe1-b7d5-321250be828e-operator-scripts\") pod \"nova-cell1-4cfc-account-create-update-sf2rb\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") " pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.141351 master-0 kubenswrapper[13046]: I0308 03:39:07.140473 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svhtn\" (UniqueName: \"kubernetes.io/projected/38bc77fe-5af2-4fe1-b7d5-321250be828e-kube-api-access-svhtn\") pod \"nova-cell1-4cfc-account-create-update-sf2rb\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") " 
pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.148710 master-0 kubenswrapper[13046]: I0308 03:39:07.147404 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bc77fe-5af2-4fe1-b7d5-321250be828e-operator-scripts\") pod \"nova-cell1-4cfc-account-create-update-sf2rb\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") " pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.160976 master-0 kubenswrapper[13046]: I0308 03:39:07.160943 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svhtn\" (UniqueName: \"kubernetes.io/projected/38bc77fe-5af2-4fe1-b7d5-321250be828e-kube-api-access-svhtn\") pod \"nova-cell1-4cfc-account-create-update-sf2rb\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") " pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.218431 master-0 kubenswrapper[13046]: I0308 03:39:07.218309 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2420-account-create-update-mb88p" Mar 08 03:39:07.336537 master-0 kubenswrapper[13046]: I0308 03:39:07.335972 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" Mar 08 03:39:07.715704 master-0 kubenswrapper[13046]: I0308 03:39:07.715589 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-765dd9b47b-xzxwf"] Mar 08 03:39:07.717575 master-0 kubenswrapper[13046]: I0308 03:39:07.717549 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.720647 master-0 kubenswrapper[13046]: I0308 03:39:07.720613 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 08 03:39:07.731466 master-0 kubenswrapper[13046]: I0308 03:39:07.730361 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 08 03:39:07.731996 master-0 kubenswrapper[13046]: I0308 03:39:07.730361 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 03:39:07.761592 master-0 kubenswrapper[13046]: I0308 03:39:07.759535 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-765dd9b47b-xzxwf"] Mar 08 03:39:07.790513 master-0 kubenswrapper[13046]: I0308 03:39:07.790445 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-public-tls-certs\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.790729 master-0 kubenswrapper[13046]: I0308 03:39:07.790534 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmdg\" (UniqueName: \"kubernetes.io/projected/d6603220-12c5-4879-a72d-f12a27e7ed84-kube-api-access-tvmdg\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.791700 master-0 kubenswrapper[13046]: I0308 03:39:07.791663 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6603220-12c5-4879-a72d-f12a27e7ed84-run-httpd\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: 
\"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.791858 master-0 kubenswrapper[13046]: I0308 03:39:07.791838 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6603220-12c5-4879-a72d-f12a27e7ed84-log-httpd\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.792140 master-0 kubenswrapper[13046]: I0308 03:39:07.792119 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-combined-ca-bundle\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.792293 master-0 kubenswrapper[13046]: I0308 03:39:07.792274 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-config-data\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.792552 master-0 kubenswrapper[13046]: I0308 03:39:07.792539 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6603220-12c5-4879-a72d-f12a27e7ed84-etc-swift\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.792666 master-0 kubenswrapper[13046]: I0308 03:39:07.792652 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-internal-tls-certs\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.896746 master-0 kubenswrapper[13046]: I0308 03:39:07.896617 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-public-tls-certs\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.896746 master-0 kubenswrapper[13046]: I0308 03:39:07.896677 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmdg\" (UniqueName: \"kubernetes.io/projected/d6603220-12c5-4879-a72d-f12a27e7ed84-kube-api-access-tvmdg\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.896963 master-0 kubenswrapper[13046]: I0308 03:39:07.896759 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6603220-12c5-4879-a72d-f12a27e7ed84-run-httpd\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.896963 master-0 kubenswrapper[13046]: I0308 03:39:07.896783 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6603220-12c5-4879-a72d-f12a27e7ed84-log-httpd\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.896963 master-0 kubenswrapper[13046]: I0308 03:39:07.896835 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-combined-ca-bundle\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.896963 master-0 kubenswrapper[13046]: I0308 03:39:07.896894 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-config-data\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.896963 master-0 kubenswrapper[13046]: I0308 03:39:07.896953 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6603220-12c5-4879-a72d-f12a27e7ed84-etc-swift\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.897122 master-0 kubenswrapper[13046]: I0308 03:39:07.896972 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-internal-tls-certs\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.899239 master-0 kubenswrapper[13046]: I0308 03:39:07.897841 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6603220-12c5-4879-a72d-f12a27e7ed84-run-httpd\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.900396 master-0 kubenswrapper[13046]: I0308 03:39:07.899999 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/d6603220-12c5-4879-a72d-f12a27e7ed84-log-httpd\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.901961 master-0 kubenswrapper[13046]: I0308 03:39:07.901908 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-internal-tls-certs\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.903671 master-0 kubenswrapper[13046]: I0308 03:39:07.903626 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-config-data\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.904996 master-0 kubenswrapper[13046]: I0308 03:39:07.904960 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-public-tls-certs\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.915364 master-0 kubenswrapper[13046]: I0308 03:39:07.915293 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6603220-12c5-4879-a72d-f12a27e7ed84-etc-swift\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.917194 master-0 kubenswrapper[13046]: I0308 03:39:07.917157 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6603220-12c5-4879-a72d-f12a27e7ed84-combined-ca-bundle\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:07.918437 master-0 kubenswrapper[13046]: I0308 03:39:07.918400 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmdg\" (UniqueName: \"kubernetes.io/projected/d6603220-12c5-4879-a72d-f12a27e7ed84-kube-api-access-tvmdg\") pod \"swift-proxy-765dd9b47b-xzxwf\" (UID: \"d6603220-12c5-4879-a72d-f12a27e7ed84\") " pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:08.063630 master-0 kubenswrapper[13046]: I0308 03:39:08.063428 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:08.132038 master-0 kubenswrapper[13046]: I0308 03:39:08.131802 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f99839b-bfe0-4190-824a-67227752928b" path="/var/lib/kubelet/pods/6f99839b-bfe0-4190-824a-67227752928b/volumes" Mar 08 03:39:08.152825 master-0 kubenswrapper[13046]: I0308 03:39:08.152672 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:39:08.206354 master-0 kubenswrapper[13046]: I0308 03:39:08.206251 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6p8h\" (UniqueName: \"kubernetes.io/projected/f3134df2-c86a-46bb-89ca-3598293b4695-kube-api-access-g6p8h\") pod \"f3134df2-c86a-46bb-89ca-3598293b4695\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " Mar 08 03:39:08.206884 master-0 kubenswrapper[13046]: I0308 03:39:08.206373 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-ovndb-tls-certs\") pod \"f3134df2-c86a-46bb-89ca-3598293b4695\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " Mar 08 03:39:08.206884 master-0 kubenswrapper[13046]: I0308 03:39:08.206570 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-httpd-config\") pod \"f3134df2-c86a-46bb-89ca-3598293b4695\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " Mar 08 03:39:08.206884 master-0 kubenswrapper[13046]: I0308 03:39:08.206678 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-config\") pod \"f3134df2-c86a-46bb-89ca-3598293b4695\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " Mar 08 03:39:08.206884 master-0 kubenswrapper[13046]: I0308 03:39:08.206728 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-combined-ca-bundle\") pod \"f3134df2-c86a-46bb-89ca-3598293b4695\" (UID: \"f3134df2-c86a-46bb-89ca-3598293b4695\") " Mar 08 03:39:08.254854 master-0 kubenswrapper[13046]: I0308 03:39:08.254798 13046 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3134df2-c86a-46bb-89ca-3598293b4695-kube-api-access-g6p8h" (OuterVolumeSpecName: "kube-api-access-g6p8h") pod "f3134df2-c86a-46bb-89ca-3598293b4695" (UID: "f3134df2-c86a-46bb-89ca-3598293b4695"). InnerVolumeSpecName "kube-api-access-g6p8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:39:08.256702 master-0 kubenswrapper[13046]: I0308 03:39:08.256637 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f3134df2-c86a-46bb-89ca-3598293b4695" (UID: "f3134df2-c86a-46bb-89ca-3598293b4695"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:08.288739 master-0 kubenswrapper[13046]: I0308 03:39:08.288689 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-config" (OuterVolumeSpecName: "config") pod "f3134df2-c86a-46bb-89ca-3598293b4695" (UID: "f3134df2-c86a-46bb-89ca-3598293b4695"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:08.314551 master-0 kubenswrapper[13046]: I0308 03:39:08.314503 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:08.314551 master-0 kubenswrapper[13046]: I0308 03:39:08.314540 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6p8h\" (UniqueName: \"kubernetes.io/projected/f3134df2-c86a-46bb-89ca-3598293b4695-kube-api-access-g6p8h\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:08.314551 master-0 kubenswrapper[13046]: I0308 03:39:08.314556 13046 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:08.329191 master-0 kubenswrapper[13046]: I0308 03:39:08.329138 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f3134df2-c86a-46bb-89ca-3598293b4695" (UID: "f3134df2-c86a-46bb-89ca-3598293b4695"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:08.335909 master-0 kubenswrapper[13046]: I0308 03:39:08.335686 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3134df2-c86a-46bb-89ca-3598293b4695" (UID: "f3134df2-c86a-46bb-89ca-3598293b4695"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:08.420040 master-0 kubenswrapper[13046]: I0308 03:39:08.419759 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:08.420040 master-0 kubenswrapper[13046]: I0308 03:39:08.419827 13046 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3134df2-c86a-46bb-89ca-3598293b4695-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:08.711656 master-0 kubenswrapper[13046]: I0308 03:39:08.711603 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6x7dc"] Mar 08 03:39:08.737759 master-0 kubenswrapper[13046]: W0308 03:39:08.737672 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb81925e_2987_49d2_a511_33a9f40ddd8c.slice/crio-a8901e3b591ad98b4210855a52699d8c6a4c5cdd4d8c13e166afaf3b7d6942e1 WatchSource:0}: Error finding container a8901e3b591ad98b4210855a52699d8c6a4c5cdd4d8c13e166afaf3b7d6942e1: Status 404 returned error can't find the container with id a8901e3b591ad98b4210855a52699d8c6a4c5cdd4d8c13e166afaf3b7d6942e1 Mar 08 03:39:08.958119 master-0 kubenswrapper[13046]: I0308 03:39:08.958085 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pdtcb"] Mar 08 03:39:08.965581 master-0 kubenswrapper[13046]: W0308 03:39:08.964847 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f9dfc00_d4d2_454d_b76f_076660d4f9e2.slice/crio-ce9c07d4825aeb32786f0980d1ce2be3d0790f02dfb3c53a8cbb905047f3973d WatchSource:0}: Error finding container ce9c07d4825aeb32786f0980d1ce2be3d0790f02dfb3c53a8cbb905047f3973d: Status 404 returned error can't 
find the container with id ce9c07d4825aeb32786f0980d1ce2be3d0790f02dfb3c53a8cbb905047f3973d Mar 08 03:39:08.990416 master-0 kubenswrapper[13046]: I0308 03:39:08.990383 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-65fa-account-create-update-rktss"] Mar 08 03:39:09.003927 master-0 kubenswrapper[13046]: I0308 03:39:09.003873 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xds8t"] Mar 08 03:39:09.047757 master-0 kubenswrapper[13046]: I0308 03:39:09.047677 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xds8t" event={"ID":"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682","Type":"ContainerStarted","Data":"f8b5c9080525db5c91440b55c8914e964a36e677e958e0f0e4481fb016a81eeb"} Mar 08 03:39:09.094777 master-0 kubenswrapper[13046]: I0308 03:39:09.088160 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rcvjv" event={"ID":"327283ac-a839-484a-a4aa-daf30c72b9f4","Type":"ContainerStarted","Data":"0cc79d5e6a4a278bdfa5a571e1ccea4389592109042ae880cffd0667a3d11640"} Mar 08 03:39:09.113532 master-0 kubenswrapper[13046]: I0308 03:39:09.110993 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67766cb894-9qmph" event={"ID":"f3134df2-c86a-46bb-89ca-3598293b4695","Type":"ContainerDied","Data":"5c1f647c671a31c7cc2d2b24735c4a6e2196d11285baf266068e423673d00c8f"} Mar 08 03:39:09.113532 master-0 kubenswrapper[13046]: I0308 03:39:09.111065 13046 scope.go:117] "RemoveContainer" containerID="170e168489adeb31e1817b93124fb91b272d50f96b75a47b83a8bfefbe03502b" Mar 08 03:39:09.113532 master-0 kubenswrapper[13046]: I0308 03:39:09.111213 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67766cb894-9qmph" Mar 08 03:39:09.121693 master-0 kubenswrapper[13046]: I0308 03:39:09.121220 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-rcvjv" podStartSLOduration=5.337704117 podStartE2EDuration="9.121204374s" podCreationTimestamp="2026-03-08 03:39:00 +0000 UTC" firstStartedPulling="2026-03-08 03:39:04.509947016 +0000 UTC m=+1546.588714233" lastFinishedPulling="2026-03-08 03:39:08.293447273 +0000 UTC m=+1550.372214490" observedRunningTime="2026-03-08 03:39:09.120853944 +0000 UTC m=+1551.199621161" watchObservedRunningTime="2026-03-08 03:39:09.121204374 +0000 UTC m=+1551.199971591" Mar 08 03:39:09.123053 master-0 kubenswrapper[13046]: I0308 03:39:09.123020 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-65fa-account-create-update-rktss" event={"ID":"5f9dfc00-d4d2-454d-b76f-076660d4f9e2","Type":"ContainerStarted","Data":"ce9c07d4825aeb32786f0980d1ce2be3d0790f02dfb3c53a8cbb905047f3973d"} Mar 08 03:39:09.170894 master-0 kubenswrapper[13046]: I0308 03:39:09.161423 13046 generic.go:334] "Generic (PLEG): container finished" podID="d093b57a-247f-4d76-8ad2-659f459f5f1a" containerID="f5b66ecee10aa226dda3d240a3ade7c8d45753a9479e69dc2ec46a736fc65d20" exitCode=1 Mar 08 03:39:09.170894 master-0 kubenswrapper[13046]: I0308 03:39:09.161535 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" event={"ID":"d093b57a-247f-4d76-8ad2-659f459f5f1a","Type":"ContainerDied","Data":"f5b66ecee10aa226dda3d240a3ade7c8d45753a9479e69dc2ec46a736fc65d20"} Mar 08 03:39:09.170894 master-0 kubenswrapper[13046]: I0308 03:39:09.162433 13046 scope.go:117] "RemoveContainer" containerID="f5b66ecee10aa226dda3d240a3ade7c8d45753a9479e69dc2ec46a736fc65d20" Mar 08 03:39:09.170894 master-0 kubenswrapper[13046]: E0308 03:39:09.162713 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-9f967cb96-7vpvw_openstack(d093b57a-247f-4d76-8ad2-659f459f5f1a)\"" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" podUID="d093b57a-247f-4d76-8ad2-659f459f5f1a" Mar 08 03:39:09.179627 master-0 kubenswrapper[13046]: I0308 03:39:09.173073 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pdtcb" event={"ID":"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11","Type":"ContainerStarted","Data":"b8b243df0138b7faa36db1f9461882c3564ff5b85963de2967a2eccfde298963"} Mar 08 03:39:09.179627 master-0 kubenswrapper[13046]: I0308 03:39:09.174026 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2420-account-create-update-mb88p"] Mar 08 03:39:09.179627 master-0 kubenswrapper[13046]: I0308 03:39:09.175173 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6x7dc" event={"ID":"db81925e-2987-49d2-a511-33a9f40ddd8c","Type":"ContainerStarted","Data":"a8901e3b591ad98b4210855a52699d8c6a4c5cdd4d8c13e166afaf3b7d6942e1"} Mar 08 03:39:09.202504 master-0 kubenswrapper[13046]: I0308 03:39:09.200918 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67766cb894-9qmph"] Mar 08 03:39:09.228362 master-0 kubenswrapper[13046]: I0308 03:39:09.226436 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67766cb894-9qmph"] Mar 08 03:39:09.247242 master-0 kubenswrapper[13046]: I0308 03:39:09.246170 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4cfc-account-create-update-sf2rb"] Mar 08 03:39:09.465618 master-0 kubenswrapper[13046]: I0308 03:39:09.461538 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-765dd9b47b-xzxwf"] Mar 08 03:39:10.148368 master-0 kubenswrapper[13046]: I0308 03:39:10.148315 13046 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" path="/var/lib/kubelet/pods/f3134df2-c86a-46bb-89ca-3598293b4695/volumes" Mar 08 03:39:10.237258 master-0 kubenswrapper[13046]: I0308 03:39:10.236951 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pdtcb" event={"ID":"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11","Type":"ContainerStarted","Data":"364f37c6908869497f0c2bd6e4b77df4d1c50c84eb6f4b98ccf89b3ea1b682f3"} Mar 08 03:39:10.240062 master-0 kubenswrapper[13046]: I0308 03:39:10.239966 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6x7dc" event={"ID":"db81925e-2987-49d2-a511-33a9f40ddd8c","Type":"ContainerStarted","Data":"2c55c4fb9c09ccade0b205372432f0fe3032f53ceb514771f77876852142d13c"} Mar 08 03:39:10.260511 master-0 kubenswrapper[13046]: I0308 03:39:10.258944 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-pdtcb" podStartSLOduration=4.258930161 podStartE2EDuration="4.258930161s" podCreationTimestamp="2026-03-08 03:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:10.258124948 +0000 UTC m=+1552.336892155" watchObservedRunningTime="2026-03-08 03:39:10.258930161 +0000 UTC m=+1552.337697378" Mar 08 03:39:10.290506 master-0 kubenswrapper[13046]: I0308 03:39:10.289624 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-6x7dc" podStartSLOduration=4.289606623 podStartE2EDuration="4.289606623s" podCreationTimestamp="2026-03-08 03:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:10.281792161 +0000 UTC m=+1552.360559378" watchObservedRunningTime="2026-03-08 03:39:10.289606623 +0000 UTC m=+1552.368373840" Mar 08 03:39:10.781083 
master-0 kubenswrapper[13046]: I0308 03:39:10.780664 13046 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:39:10.784582 master-0 kubenswrapper[13046]: I0308 03:39:10.782668 13046 scope.go:117] "RemoveContainer" containerID="f5b66ecee10aa226dda3d240a3ade7c8d45753a9479e69dc2ec46a736fc65d20" Mar 08 03:39:10.784582 master-0 kubenswrapper[13046]: E0308 03:39:10.783196 13046 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-9f967cb96-7vpvw_openstack(d093b57a-247f-4d76-8ad2-659f459f5f1a)\"" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" podUID="d093b57a-247f-4d76-8ad2-659f459f5f1a" Mar 08 03:39:12.066148 master-0 kubenswrapper[13046]: I0308 03:39:12.066071 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:39:12.066614 master-0 kubenswrapper[13046]: I0308 03:39:12.066381 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-external-api-0" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-log" containerID="cri-o://adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d" gracePeriod=30 Mar 08 03:39:12.066724 master-0 kubenswrapper[13046]: I0308 03:39:12.066674 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-external-api-0" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-httpd" containerID="cri-o://2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab" gracePeriod=30 Mar 08 03:39:12.075011 master-0 kubenswrapper[13046]: I0308 03:39:12.074948 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-bf784-default-external-api-0" 
podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-log" probeResult="failure" output="Get \"https://10.128.0.215:9292/healthcheck\": EOF" Mar 08 03:39:12.075123 master-0 kubenswrapper[13046]: I0308 03:39:12.075100 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-bf784-default-external-api-0" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-log" probeResult="failure" output="Get \"https://10.128.0.215:9292/healthcheck\": EOF" Mar 08 03:39:12.085520 master-0 kubenswrapper[13046]: I0308 03:39:12.083390 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-bf784-default-external-api-0" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.128.0.215:9292/healthcheck\": EOF" Mar 08 03:39:12.085520 master-0 kubenswrapper[13046]: I0308 03:39:12.083385 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-bf784-default-external-api-0" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.128.0.215:9292/healthcheck\": EOF" Mar 08 03:39:14.368579 master-0 kubenswrapper[13046]: I0308 03:39:14.367939 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:39:14.368579 master-0 kubenswrapper[13046]: I0308 03:39:14.368268 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-internal-api-0" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerName="glance-log" containerID="cri-o://2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb" gracePeriod=30 Mar 08 03:39:14.368579 master-0 kubenswrapper[13046]: I0308 03:39:14.368412 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-bf784-default-internal-api-0" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" 
containerName="glance-httpd" containerID="cri-o://bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439" gracePeriod=30 Mar 08 03:39:18.429719 master-0 kubenswrapper[13046]: I0308 03:39:18.429408 13046 generic.go:334] "Generic (PLEG): container finished" podID="3e537f07-7fd3-4505-8490-b028c741c650" containerID="2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab" exitCode=0 Mar 08 03:39:18.429719 master-0 kubenswrapper[13046]: I0308 03:39:18.429504 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3e537f07-7fd3-4505-8490-b028c741c650","Type":"ContainerDied","Data":"2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab"} Mar 08 03:39:18.432162 master-0 kubenswrapper[13046]: I0308 03:39:18.432136 13046 generic.go:334] "Generic (PLEG): container finished" podID="327283ac-a839-484a-a4aa-daf30c72b9f4" containerID="0cc79d5e6a4a278bdfa5a571e1ccea4389592109042ae880cffd0667a3d11640" exitCode=0 Mar 08 03:39:18.432243 master-0 kubenswrapper[13046]: I0308 03:39:18.432176 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rcvjv" event={"ID":"327283ac-a839-484a-a4aa-daf30c72b9f4","Type":"ContainerDied","Data":"0cc79d5e6a4a278bdfa5a571e1ccea4389592109042ae880cffd0667a3d11640"} Mar 08 03:39:21.567379 master-0 kubenswrapper[13046]: W0308 03:39:21.567018 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6603220_12c5_4879_a72d_f12a27e7ed84.slice/crio-cf97fb3f7f1e50e814a90a8382ae11baf14fb3f32a7428704064ba7b67bef15d WatchSource:0}: Error finding container cf97fb3f7f1e50e814a90a8382ae11baf14fb3f32a7428704064ba7b67bef15d: Status 404 returned error can't find the container with id cf97fb3f7f1e50e814a90a8382ae11baf14fb3f32a7428704064ba7b67bef15d Mar 08 03:39:21.591302 master-0 kubenswrapper[13046]: W0308 03:39:21.591240 13046 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38bc77fe_5af2_4fe1_b7d5_321250be828e.slice/crio-3f5976359ed5966dc73ec942c57af1f4574c67d661dc106afea78a8fb354f6a6 WatchSource:0}: Error finding container 3f5976359ed5966dc73ec942c57af1f4574c67d661dc106afea78a8fb354f6a6: Status 404 returned error can't find the container with id 3f5976359ed5966dc73ec942c57af1f4574c67d661dc106afea78a8fb354f6a6 Mar 08 03:39:21.602590 master-0 kubenswrapper[13046]: I0308 03:39:21.598259 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 03:39:21.684653 master-0 kubenswrapper[13046]: I0308 03:39:21.684470 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 03:39:21.710926 master-0 kubenswrapper[13046]: I0308 03:39:21.710761 13046 scope.go:117] "RemoveContainer" containerID="4f9d048d579f237ee3fe87b15b98098d4dd4101762f67f7139a1ec5c7df66f1d" Mar 08 03:39:21.807348 master-0 kubenswrapper[13046]: I0308 03:39:21.807306 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-rcvjv" Mar 08 03:39:21.833093 master-0 kubenswrapper[13046]: I0308 03:39:21.832244 13046 scope.go:117] "RemoveContainer" containerID="4c2490c757bd4af990c59c840fc0cceb1cf1204f9198bcc6a46f24a8b703cd6d" Mar 08 03:39:21.904580 master-0 kubenswrapper[13046]: I0308 03:39:21.904531 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rldcz\" (UniqueName: \"kubernetes.io/projected/327283ac-a839-484a-a4aa-daf30c72b9f4-kube-api-access-rldcz\") pod \"327283ac-a839-484a-a4aa-daf30c72b9f4\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " Mar 08 03:39:21.904714 master-0 kubenswrapper[13046]: I0308 03:39:21.904599 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic\") pod \"327283ac-a839-484a-a4aa-daf30c72b9f4\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " Mar 08 03:39:21.904714 master-0 kubenswrapper[13046]: I0308 03:39:21.904675 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/327283ac-a839-484a-a4aa-daf30c72b9f4-etc-podinfo\") pod \"327283ac-a839-484a-a4aa-daf30c72b9f4\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " Mar 08 03:39:21.904833 master-0 kubenswrapper[13046]: I0308 03:39:21.904769 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-combined-ca-bundle\") pod \"327283ac-a839-484a-a4aa-daf30c72b9f4\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " Mar 08 03:39:21.904833 master-0 kubenswrapper[13046]: I0308 03:39:21.904820 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-scripts\") pod \"327283ac-a839-484a-a4aa-daf30c72b9f4\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " Mar 08 03:39:21.904927 master-0 kubenswrapper[13046]: I0308 03:39:21.904899 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"327283ac-a839-484a-a4aa-daf30c72b9f4\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " Mar 08 03:39:21.905328 master-0 kubenswrapper[13046]: I0308 03:39:21.904984 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-config\") pod \"327283ac-a839-484a-a4aa-daf30c72b9f4\" (UID: \"327283ac-a839-484a-a4aa-daf30c72b9f4\") " Mar 08 03:39:21.905825 master-0 kubenswrapper[13046]: I0308 03:39:21.905795 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "327283ac-a839-484a-a4aa-daf30c72b9f4" (UID: "327283ac-a839-484a-a4aa-daf30c72b9f4"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:21.906092 master-0 kubenswrapper[13046]: I0308 03:39:21.906062 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "327283ac-a839-484a-a4aa-daf30c72b9f4" (UID: "327283ac-a839-484a-a4aa-daf30c72b9f4"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:21.907592 master-0 kubenswrapper[13046]: I0308 03:39:21.907534 13046 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:21.907853 master-0 kubenswrapper[13046]: I0308 03:39:21.907584 13046 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/327283ac-a839-484a-a4aa-daf30c72b9f4-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:21.909834 master-0 kubenswrapper[13046]: I0308 03:39:21.909790 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/327283ac-a839-484a-a4aa-daf30c72b9f4-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "327283ac-a839-484a-a4aa-daf30c72b9f4" (UID: "327283ac-a839-484a-a4aa-daf30c72b9f4"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 03:39:21.910361 master-0 kubenswrapper[13046]: I0308 03:39:21.910319 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-scripts" (OuterVolumeSpecName: "scripts") pod "327283ac-a839-484a-a4aa-daf30c72b9f4" (UID: "327283ac-a839-484a-a4aa-daf30c72b9f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:21.915940 master-0 kubenswrapper[13046]: I0308 03:39:21.915846 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327283ac-a839-484a-a4aa-daf30c72b9f4-kube-api-access-rldcz" (OuterVolumeSpecName: "kube-api-access-rldcz") pod "327283ac-a839-484a-a4aa-daf30c72b9f4" (UID: "327283ac-a839-484a-a4aa-daf30c72b9f4"). InnerVolumeSpecName "kube-api-access-rldcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:39:22.010650 master-0 kubenswrapper[13046]: I0308 03:39:22.009911 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:22.010650 master-0 kubenswrapper[13046]: I0308 03:39:22.009965 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rldcz\" (UniqueName: \"kubernetes.io/projected/327283ac-a839-484a-a4aa-daf30c72b9f4-kube-api-access-rldcz\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:22.010650 master-0 kubenswrapper[13046]: I0308 03:39:22.009983 13046 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/327283ac-a839-484a-a4aa-daf30c72b9f4-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:22.107627 master-0 kubenswrapper[13046]: I0308 03:39:22.103893 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "327283ac-a839-484a-a4aa-daf30c72b9f4" (UID: "327283ac-a839-484a-a4aa-daf30c72b9f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:22.121413 master-0 kubenswrapper[13046]: I0308 03:39:22.118147 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:22.141571 master-0 kubenswrapper[13046]: I0308 03:39:22.141456 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-config" (OuterVolumeSpecName: "config") pod "327283ac-a839-484a-a4aa-daf30c72b9f4" (UID: "327283ac-a839-484a-a4aa-daf30c72b9f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:22.187715 master-0 kubenswrapper[13046]: I0308 03:39:22.187654 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.224033 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-httpd-run\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.224989 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.225058 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-logs\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: 
\"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.225135 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.225166 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-scripts\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.225229 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l86c7\" (UniqueName: \"kubernetes.io/projected/3e537f07-7fd3-4505-8490-b028c741c650-kube-api-access-l86c7\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.225287 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-combined-ca-bundle\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.225518 master-0 kubenswrapper[13046]: I0308 03:39:22.225341 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-config-data\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") " Mar 08 03:39:22.226376 master-0 kubenswrapper[13046]: I0308 03:39:22.225960 13046 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/327283ac-a839-484a-a4aa-daf30c72b9f4-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:22.229495 master-0 kubenswrapper[13046]: I0308 03:39:22.229291 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-logs" (OuterVolumeSpecName: "logs") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:22.233499 master-0 kubenswrapper[13046]: I0308 03:39:22.232262 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:22.276303 master-0 kubenswrapper[13046]: I0308 03:39:22.275999 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-scripts" (OuterVolumeSpecName: "scripts") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:22.276571 master-0 kubenswrapper[13046]: I0308 03:39:22.276438 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82" (OuterVolumeSpecName: "glance") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd". 
PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 03:39:22.286547 master-0 kubenswrapper[13046]: I0308 03:39:22.277994 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e537f07-7fd3-4505-8490-b028c741c650-kube-api-access-l86c7" (OuterVolumeSpecName: "kube-api-access-l86c7") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "kube-api-access-l86c7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:22.304561 master-0 kubenswrapper[13046]: I0308 03:39:22.299361 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.330755 master-0 kubenswrapper[13046]: I0308 03:39:22.328721 13046 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") on node \"master-0\" "
Mar 08 03:39:22.330755 master-0 kubenswrapper[13046]: I0308 03:39:22.328767 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.330755 master-0 kubenswrapper[13046]: I0308 03:39:22.328779 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.330755 master-0 kubenswrapper[13046]: I0308 03:39:22.328790 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l86c7\" (UniqueName: \"kubernetes.io/projected/3e537f07-7fd3-4505-8490-b028c741c650-kube-api-access-l86c7\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.330755 master-0 kubenswrapper[13046]: I0308 03:39:22.328800 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.330755 master-0 kubenswrapper[13046]: I0308 03:39:22.328809 13046 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e537f07-7fd3-4505-8490-b028c741c650-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.367543 master-0 kubenswrapper[13046]: I0308 03:39:22.366722 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:39:22.444710 master-0 kubenswrapper[13046]: I0308 03:39:22.442530 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-config-data" (OuterVolumeSpecName: "config-data") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.477691 master-0 kubenswrapper[13046]: I0308 03:39:22.475061 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.496577 master-0 kubenswrapper[13046]: I0308 03:39:22.486398 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs\") pod \"3e537f07-7fd3-4505-8490-b028c741c650\" (UID: \"3e537f07-7fd3-4505-8490-b028c741c650\") "
Mar 08 03:39:22.496577 master-0 kubenswrapper[13046]: W0308 03:39:22.489302 13046 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3e537f07-7fd3-4505-8490-b028c741c650/volumes/kubernetes.io~secret/public-tls-certs
Mar 08 03:39:22.496577 master-0 kubenswrapper[13046]: I0308 03:39:22.489333 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e537f07-7fd3-4505-8490-b028c741c650" (UID: "3e537f07-7fd3-4505-8490-b028c741c650"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.553831 master-0 kubenswrapper[13046]: I0308 03:39:22.553781 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.554053 master-0 kubenswrapper[13046]: I0308 03:39:22.553888 13046 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e537f07-7fd3-4505-8490-b028c741c650-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.648028 master-0 kubenswrapper[13046]: I0308 03:39:22.647949 13046 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 08 03:39:22.648532 master-0 kubenswrapper[13046]: I0308 03:39:22.648155 13046 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd" (UniqueName: "kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82") on node "master-0"
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658079 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-scripts\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658251 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658334 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft526\" (UniqueName: \"kubernetes.io/projected/ea8edfcc-d36d-445e-918b-38a71b2cafa4-kube-api-access-ft526\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658385 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-httpd-run\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658423 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-logs\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658480 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-combined-ca-bundle\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658521 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-internal-tls-certs\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658545 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-config-data\") pod \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\" (UID: \"ea8edfcc-d36d-445e-918b-38a71b2cafa4\") "
Mar 08 03:39:22.662334 master-0 kubenswrapper[13046]: I0308 03:39:22.658867 13046 reconciler_common.go:293] "Volume detached for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.663029 master-0 kubenswrapper[13046]: I0308 03:39:22.662983 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:39:22.670521 master-0 kubenswrapper[13046]: I0308 03:39:22.668635 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-logs" (OuterVolumeSpecName: "logs") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:39:22.685238 master-0 kubenswrapper[13046]: I0308 03:39:22.685192 13046 generic.go:334] "Generic (PLEG): container finished" podID="3e537f07-7fd3-4505-8490-b028c741c650" containerID="adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d" exitCode=143
Mar 08 03:39:22.685352 master-0 kubenswrapper[13046]: I0308 03:39:22.685285 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3e537f07-7fd3-4505-8490-b028c741c650","Type":"ContainerDied","Data":"adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d"}
Mar 08 03:39:22.685352 master-0 kubenswrapper[13046]: I0308 03:39:22.685313 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"3e537f07-7fd3-4505-8490-b028c741c650","Type":"ContainerDied","Data":"df8d8a8dac7d8a5e29c321f37c2004c3b928c131bbee03ea5ffc898a7587c970"}
Mar 08 03:39:22.685352 master-0 kubenswrapper[13046]: I0308 03:39:22.685333 13046 scope.go:117] "RemoveContainer" containerID="2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab"
Mar 08 03:39:22.685507 master-0 kubenswrapper[13046]: I0308 03:39:22.685466 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:39:22.692326 master-0 kubenswrapper[13046]: I0308 03:39:22.692273 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-765dd9b47b-xzxwf" event={"ID":"d6603220-12c5-4879-a72d-f12a27e7ed84","Type":"ContainerStarted","Data":"eb832dad2ac2a7f422e25495fe4a1ac9213c8c1b35281b5f9c622b1af2ba1aea"}
Mar 08 03:39:22.692390 master-0 kubenswrapper[13046]: I0308 03:39:22.692329 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-765dd9b47b-xzxwf" event={"ID":"d6603220-12c5-4879-a72d-f12a27e7ed84","Type":"ContainerStarted","Data":"cf97fb3f7f1e50e814a90a8382ae11baf14fb3f32a7428704064ba7b67bef15d"}
Mar 08 03:39:22.694304 master-0 kubenswrapper[13046]: I0308 03:39:22.694255 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2420-account-create-update-mb88p" event={"ID":"2f3459cf-9e53-466c-906d-6a9033b782f1","Type":"ContainerStarted","Data":"2f3414ba3620c1892aedff07a5a4ed1960c34f3a19ab0fa0b736b988f75cf166"}
Mar 08 03:39:22.694304 master-0 kubenswrapper[13046]: I0308 03:39:22.694303 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2420-account-create-update-mb88p" event={"ID":"2f3459cf-9e53-466c-906d-6a9033b782f1","Type":"ContainerStarted","Data":"eeab13b27c71160d81630fab08f9be3b6945f8ac8ebd20056cc012cea0a993c7"}
Mar 08 03:39:22.706588 master-0 kubenswrapper[13046]: I0308 03:39:22.706284 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8edfcc-d36d-445e-918b-38a71b2cafa4-kube-api-access-ft526" (OuterVolumeSpecName: "kube-api-access-ft526") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "kube-api-access-ft526". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:22.706588 master-0 kubenswrapper[13046]: I0308 03:39:22.706344 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-scripts" (OuterVolumeSpecName: "scripts") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.707118 master-0 kubenswrapper[13046]: I0308 03:39:22.707071 13046 generic.go:334] "Generic (PLEG): container finished" podID="db81925e-2987-49d2-a511-33a9f40ddd8c" containerID="2c55c4fb9c09ccade0b205372432f0fe3032f53ceb514771f77876852142d13c" exitCode=0
Mar 08 03:39:22.707177 master-0 kubenswrapper[13046]: I0308 03:39:22.707151 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6x7dc" event={"ID":"db81925e-2987-49d2-a511-33a9f40ddd8c","Type":"ContainerDied","Data":"2c55c4fb9c09ccade0b205372432f0fe3032f53ceb514771f77876852142d13c"}
Mar 08 03:39:22.727613 master-0 kubenswrapper[13046]: I0308 03:39:22.727546 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" event={"ID":"38bc77fe-5af2-4fe1-b7d5-321250be828e","Type":"ContainerStarted","Data":"011294082c37a5329c4e22fb7c0aef8f312655fabcd99e66c51dbe3fd5a0d2e3"}
Mar 08 03:39:22.727613 master-0 kubenswrapper[13046]: I0308 03:39:22.727601 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" event={"ID":"38bc77fe-5af2-4fe1-b7d5-321250be828e","Type":"ContainerStarted","Data":"3f5976359ed5966dc73ec942c57af1f4574c67d661dc106afea78a8fb354f6a6"}
Mar 08 03:39:22.734828 master-0 kubenswrapper[13046]: I0308 03:39:22.733689 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0" (OuterVolumeSpecName: "glance") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 03:39:22.740492 master-0 kubenswrapper[13046]: I0308 03:39:22.740419 13046 generic.go:334] "Generic (PLEG): container finished" podID="a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11" containerID="364f37c6908869497f0c2bd6e4b77df4d1c50c84eb6f4b98ccf89b3ea1b682f3" exitCode=0
Mar 08 03:39:22.740711 master-0 kubenswrapper[13046]: I0308 03:39:22.740539 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pdtcb" event={"ID":"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11","Type":"ContainerDied","Data":"364f37c6908869497f0c2bd6e4b77df4d1c50c84eb6f4b98ccf89b3ea1b682f3"}
Mar 08 03:39:22.745894 master-0 kubenswrapper[13046]: I0308 03:39:22.744136 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-2420-account-create-update-mb88p" podStartSLOduration=16.744113854 podStartE2EDuration="16.744113854s" podCreationTimestamp="2026-03-08 03:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:22.7263734 +0000 UTC m=+1564.805140617" watchObservedRunningTime="2026-03-08 03:39:22.744113854 +0000 UTC m=+1564.822881071"
Mar 08 03:39:22.745894 master-0 kubenswrapper[13046]: I0308 03:39:22.745708 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rcvjv" event={"ID":"327283ac-a839-484a-a4aa-daf30c72b9f4","Type":"ContainerDied","Data":"edc2c1f7706e71a434e13c55067567281ca679bdda51f07085c9a3275192c957"}
Mar 08 03:39:22.745894 master-0 kubenswrapper[13046]: I0308 03:39:22.745748 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc2c1f7706e71a434e13c55067567281ca679bdda51f07085c9a3275192c957"
Mar 08 03:39:22.745894 master-0 kubenswrapper[13046]: I0308 03:39:22.745822 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-rcvjv"
Mar 08 03:39:22.752734 master-0 kubenswrapper[13046]: I0308 03:39:22.751621 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-65fa-account-create-update-rktss" event={"ID":"5f9dfc00-d4d2-454d-b76f-076660d4f9e2","Type":"ContainerStarted","Data":"4279058fa337d712d6af823b86ef9aaef600530db00fb1400bb2661a453c5f26"}
Mar 08 03:39:22.761506 master-0 kubenswrapper[13046]: I0308 03:39:22.758547 13046 generic.go:334] "Generic (PLEG): container finished" podID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerID="bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439" exitCode=0
Mar 08 03:39:22.761506 master-0 kubenswrapper[13046]: I0308 03:39:22.758579 13046 generic.go:334] "Generic (PLEG): container finished" podID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerID="2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb" exitCode=143
Mar 08 03:39:22.761506 master-0 kubenswrapper[13046]: I0308 03:39:22.758625 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"ea8edfcc-d36d-445e-918b-38a71b2cafa4","Type":"ContainerDied","Data":"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"}
Mar 08 03:39:22.761506 master-0 kubenswrapper[13046]: I0308 03:39:22.758648 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"ea8edfcc-d36d-445e-918b-38a71b2cafa4","Type":"ContainerDied","Data":"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb"}
Mar 08 03:39:22.761506 master-0 kubenswrapper[13046]: I0308 03:39:22.758657 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"ea8edfcc-d36d-445e-918b-38a71b2cafa4","Type":"ContainerDied","Data":"abbbe860644150bd69db3be6ba3f97c1a9ff4d0f279fe5b9b62e56433650fc51"}
Mar 08 03:39:22.761506 master-0 kubenswrapper[13046]: I0308 03:39:22.758724 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:39:22.763103 master-0 kubenswrapper[13046]: I0308 03:39:22.762547 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.763103 master-0 kubenswrapper[13046]: I0308 03:39:22.762591 13046 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") on node \"master-0\" "
Mar 08 03:39:22.763103 master-0 kubenswrapper[13046]: I0308 03:39:22.762603 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft526\" (UniqueName: \"kubernetes.io/projected/ea8edfcc-d36d-445e-918b-38a71b2cafa4-kube-api-access-ft526\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.763103 master-0 kubenswrapper[13046]: I0308 03:39:22.762615 13046 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.763103 master-0 kubenswrapper[13046]: I0308 03:39:22.762623 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea8edfcc-d36d-445e-918b-38a71b2cafa4-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.767631 master-0 kubenswrapper[13046]: I0308 03:39:22.766696 13046 generic.go:334] "Generic (PLEG): container finished" podID="7b8ddcef-7329-4111-b9bd-c4ce0d6bf682" containerID="de5ab68dcd6866ba06ec2863e3e418b43b43b2b583d1232424b8d5115760880a" exitCode=0
Mar 08 03:39:22.767631 master-0 kubenswrapper[13046]: I0308 03:39:22.766777 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xds8t" event={"ID":"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682","Type":"ContainerDied","Data":"de5ab68dcd6866ba06ec2863e3e418b43b43b2b583d1232424b8d5115760880a"}
Mar 08 03:39:22.793860 master-0 kubenswrapper[13046]: I0308 03:39:22.793777 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" podStartSLOduration=16.793757615 podStartE2EDuration="16.793757615s" podCreationTimestamp="2026-03-08 03:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:22.785085188 +0000 UTC m=+1564.863852405" watchObservedRunningTime="2026-03-08 03:39:22.793757615 +0000 UTC m=+1564.872524822"
Mar 08 03:39:22.803870 master-0 kubenswrapper[13046]: I0308 03:39:22.803787 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-65fa-account-create-update-rktss" podStartSLOduration=16.803764939 podStartE2EDuration="16.803764939s" podCreationTimestamp="2026-03-08 03:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:22.802530674 +0000 UTC m=+1564.881297891" watchObservedRunningTime="2026-03-08 03:39:22.803764939 +0000 UTC m=+1564.882532156"
Mar 08 03:39:22.888702 master-0 kubenswrapper[13046]: I0308 03:39:22.888649 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.890637 master-0 kubenswrapper[13046]: I0308 03:39:22.890369 13046 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 08 03:39:22.891205 master-0 kubenswrapper[13046]: I0308 03:39:22.891193 13046 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c" (UniqueName: "kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0") on node "master-0"
Mar 08 03:39:22.894023 master-0 kubenswrapper[13046]: I0308 03:39:22.893974 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.894023 master-0 kubenswrapper[13046]: I0308 03:39:22.894019 13046 reconciler_common.go:293] "Volume detached for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.922504 master-0 kubenswrapper[13046]: I0308 03:39:22.922037 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.944031 master-0 kubenswrapper[13046]: I0308 03:39:22.943953 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-config-data" (OuterVolumeSpecName: "config-data") pod "ea8edfcc-d36d-445e-918b-38a71b2cafa4" (UID: "ea8edfcc-d36d-445e-918b-38a71b2cafa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:39:22.997078 master-0 kubenswrapper[13046]: I0308 03:39:22.996894 13046 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:22.997078 master-0 kubenswrapper[13046]: I0308 03:39:22.996949 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea8edfcc-d36d-445e-918b-38a71b2cafa4-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:23.119421 master-0 kubenswrapper[13046]: I0308 03:39:23.119380 13046 scope.go:117] "RemoveContainer" containerID="f5b66ecee10aa226dda3d240a3ade7c8d45753a9479e69dc2ec46a736fc65d20"
Mar 08 03:39:23.252940 master-0 kubenswrapper[13046]: I0308 03:39:23.252879 13046 scope.go:117] "RemoveContainer" containerID="adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d"
Mar 08 03:39:23.315046 master-0 kubenswrapper[13046]: I0308 03:39:23.314826 13046 scope.go:117] "RemoveContainer" containerID="2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab"
Mar 08 03:39:23.316241 master-0 kubenswrapper[13046]: E0308 03:39:23.316070 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab\": container with ID starting with 2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab not found: ID does not exist" containerID="2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab"
Mar 08 03:39:23.316241 master-0 kubenswrapper[13046]: I0308 03:39:23.316117 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab"} err="failed to get container status \"2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab\": rpc error: code = NotFound desc = could not find container \"2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab\": container with ID starting with 2434afdd988f1a7d2e8a98b45117c7a94005d1387fe1836f6e7f3b32fa9dc4ab not found: ID does not exist"
Mar 08 03:39:23.316241 master-0 kubenswrapper[13046]: I0308 03:39:23.316143 13046 scope.go:117] "RemoveContainer" containerID="adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d"
Mar 08 03:39:23.316506 master-0 kubenswrapper[13046]: E0308 03:39:23.316446 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d\": container with ID starting with adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d not found: ID does not exist" containerID="adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d"
Mar 08 03:39:23.316622 master-0 kubenswrapper[13046]: I0308 03:39:23.316575 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d"} err="failed to get container status \"adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d\": rpc error: code = NotFound desc = could not find container \"adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d\": container with ID starting with adea6ef34005a0711f0485b7f15d05a2bb3d91f979fdf0cc5a623c6221675e3d not found: ID does not exist"
Mar 08 03:39:23.316622 master-0 kubenswrapper[13046]: I0308 03:39:23.316619 13046 scope.go:117] "RemoveContainer" containerID="bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"
Mar 08 03:39:23.317191 master-0 kubenswrapper[13046]: I0308 03:39:23.317150 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"]
Mar 08 03:39:23.344573 master-0 kubenswrapper[13046]: I0308 03:39:23.344538 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"]
Mar 08 03:39:23.359251 master-0 kubenswrapper[13046]: I0308 03:39:23.359128 13046 scope.go:117] "RemoveContainer" containerID="2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb"
Mar 08 03:39:23.363426 master-0 kubenswrapper[13046]: I0308 03:39:23.363385 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf784-default-external-api-0"]
Mar 08 03:39:23.402893 master-0 kubenswrapper[13046]: I0308 03:39:23.400538 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bf784-default-external-api-0"]
Mar 08 03:39:23.415856 master-0 kubenswrapper[13046]: I0308 03:39:23.415413 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bf784-default-internal-api-0"]
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: E0308 03:39:23.416208 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327283ac-a839-484a-a4aa-daf30c72b9f4" containerName="ironic-inspector-db-sync"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416230 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="327283ac-a839-484a-a4aa-daf30c72b9f4" containerName="ironic-inspector-db-sync"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: E0308 03:39:23.416250 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-log"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416256 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-log"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: E0308 03:39:23.416271 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerName="glance-log"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416276 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerName="glance-log"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: E0308 03:39:23.416300 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerName="glance-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416306 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerName="glance-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: E0308 03:39:23.416320 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" containerName="neutron-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416328 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" containerName="neutron-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: E0308 03:39:23.416357 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416362 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: E0308 03:39:23.416372 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" containerName="neutron-api"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416378 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" containerName="neutron-api"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416615 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" containerName="neutron-api"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416643 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerName="glance-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416653 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" containerName="glance-log"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416670 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="327283ac-a839-484a-a4aa-daf30c72b9f4" containerName="ironic-inspector-db-sync"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416680 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416698 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3134df2-c86a-46bb-89ca-3598293b4695" containerName="neutron-httpd"
Mar 08 03:39:23.420970 master-0 kubenswrapper[13046]: I0308 03:39:23.416711 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e537f07-7fd3-4505-8490-b028c741c650" containerName="glance-log"
Mar 08 03:39:23.431700 master-0 kubenswrapper[13046]: I0308 03:39:23.423682 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:39:23.438695 master-0 kubenswrapper[13046]: I0308 03:39:23.436801 13046 scope.go:117] "RemoveContainer" containerID="bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"
Mar 08 03:39:23.438695 master-0 kubenswrapper[13046]: I0308 03:39:23.436814 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-default-internal-config-data"
Mar 08 03:39:23.438695 master-0 kubenswrapper[13046]: I0308 03:39:23.436891 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 08 03:39:23.438695 master-0 kubenswrapper[13046]: I0308 03:39:23.436855 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 08 03:39:23.438695 master-0 kubenswrapper[13046]: E0308 03:39:23.437911 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439\": container with ID starting with bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439 not found: ID does not exist" containerID="bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"
Mar 08 03:39:23.441147 master-0 kubenswrapper[13046]: I0308 03:39:23.439696 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"} err="failed to get container status \"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439\": rpc error: code = NotFound desc = could not find container \"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439\": container with ID starting with bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439 not found: ID does not exist"
Mar 08 03:39:23.441147 master-0 kubenswrapper[13046]: I0308 03:39:23.439737 13046 scope.go:117] "RemoveContainer" containerID="2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb"
Mar 08 03:39:23.441544 master-0 kubenswrapper[13046]: E0308 03:39:23.441444 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb\": container with ID starting with 2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb not found: ID does not exist" containerID="2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb"
Mar 08 03:39:23.441544 master-0 kubenswrapper[13046]: I0308 03:39:23.441501 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb"} err="failed to get container status \"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb\": rpc error: code = NotFound desc = could not find container \"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb\": container with ID starting with 2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb not found: ID does not exist"
Mar 08 03:39:23.441544 master-0 kubenswrapper[13046]: I0308 03:39:23.441525 13046 scope.go:117] "RemoveContainer" containerID="bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"
Mar 08 03:39:23.442823 master-0 kubenswrapper[13046]: I0308 03:39:23.442797 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439"} err="failed to get container status \"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439\": rpc error: code = NotFound desc = could not find container \"bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439\": container with ID starting with bfa96f55d28658db78ab80546eb873a1574ada82ab78562634fb27368daf6439 not
found: ID does not exist" Mar 08 03:39:23.442823 master-0 kubenswrapper[13046]: I0308 03:39:23.442819 13046 scope.go:117] "RemoveContainer" containerID="2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb" Mar 08 03:39:23.443152 master-0 kubenswrapper[13046]: I0308 03:39:23.443130 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb"} err="failed to get container status \"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb\": rpc error: code = NotFound desc = could not find container \"2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb\": container with ID starting with 2fd839350234c761310fd9be06b64e3a5f1bdbd550ccdba3605d0dc64d3e9dbb not found: ID does not exist" Mar 08 03:39:23.492052 master-0 kubenswrapper[13046]: I0308 03:39:23.491911 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:39:23.508781 master-0 kubenswrapper[13046]: I0308 03:39:23.503020 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.509631 master-0 kubenswrapper[13046]: I0308 03:39:23.509473 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-bf784-default-external-config-data" Mar 08 03:39:23.512830 master-0 kubenswrapper[13046]: I0308 03:39:23.512791 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.544655 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c075f3-f193-415e-a94d-d5ee77a6738b-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.544746 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.544793 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.544819 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/98c075f3-f193-415e-a94d-d5ee77a6738b-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.544882 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.544942 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.545011 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-public-tls-certs\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.545084 master-0 kubenswrapper[13046]: I0308 03:39:23.545064 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25glv\" (UniqueName: \"kubernetes.io/projected/98c075f3-f193-415e-a94d-d5ee77a6738b-kube-api-access-25glv\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 
03:39:23.574500 master-0 kubenswrapper[13046]: I0308 03:39:23.574030 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:39:23.593732 master-0 kubenswrapper[13046]: I0308 03:39:23.590569 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:39:23.631468 master-0 kubenswrapper[13046]: I0308 03:39:23.631399 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fd8d55b9-c4jcr"] Mar 08 03:39:23.635701 master-0 kubenswrapper[13046]: I0308 03:39:23.635294 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:23.647472 master-0 kubenswrapper[13046]: I0308 03:39:23.647401 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-internal-tls-certs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.648555 master-0 kubenswrapper[13046]: I0308 03:39:23.648517 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648623 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxq6r\" (UniqueName: \"kubernetes.io/projected/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-kube-api-access-xxq6r\") pod \"glance-bf784-default-internal-api-0\" (UID: 
\"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648669 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648691 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648724 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-public-tls-certs\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648757 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648780 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648810 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25glv\" (UniqueName: \"kubernetes.io/projected/98c075f3-f193-415e-a94d-d5ee77a6738b-kube-api-access-25glv\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.648915 master-0 kubenswrapper[13046]: I0308 03:39:23.648863 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c075f3-f193-415e-a94d-d5ee77a6738b-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.649165 master-0 kubenswrapper[13046]: I0308 03:39:23.648945 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.649165 master-0 kubenswrapper[13046]: I0308 03:39:23.649008 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.649165 master-0 kubenswrapper[13046]: I0308 03:39:23.649058 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.649165 master-0 kubenswrapper[13046]: I0308 03:39:23.649075 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.649165 master-0 kubenswrapper[13046]: I0308 03:39:23.649099 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.649165 master-0 kubenswrapper[13046]: I0308 03:39:23.649120 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98c075f3-f193-415e-a94d-d5ee77a6738b-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.649775 master-0 kubenswrapper[13046]: I0308 03:39:23.649564 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98c075f3-f193-415e-a94d-d5ee77a6738b-httpd-run\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 
03:39:23.656165 master-0 kubenswrapper[13046]: I0308 03:39:23.655028 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-combined-ca-bundle\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.656255 master-0 kubenswrapper[13046]: I0308 03:39:23.656212 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 03:39:23.656304 master-0 kubenswrapper[13046]: I0308 03:39:23.656253 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7ee56d67d41b92c054773f64bf7346771894ec4c4c18aa0117ea2564b4c6d4a8/globalmount\"" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.656769 master-0 kubenswrapper[13046]: I0308 03:39:23.656733 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c075f3-f193-415e-a94d-d5ee77a6738b-logs\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.658275 master-0 kubenswrapper[13046]: I0308 03:39:23.658246 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fd8d55b9-c4jcr"] Mar 08 03:39:23.660225 master-0 kubenswrapper[13046]: I0308 03:39:23.659847 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-config-data\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.662609 master-0 kubenswrapper[13046]: I0308 03:39:23.662585 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-scripts\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.678917 master-0 kubenswrapper[13046]: I0308 03:39:23.664890 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c075f3-f193-415e-a94d-d5ee77a6738b-public-tls-certs\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.696358 master-0 kubenswrapper[13046]: I0308 03:39:23.694155 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25glv\" (UniqueName: \"kubernetes.io/projected/98c075f3-f193-415e-a94d-d5ee77a6738b-kube-api-access-25glv\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:23.714609 master-0 kubenswrapper[13046]: I0308 03:39:23.714545 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:23.729502 master-0 kubenswrapper[13046]: I0308 03:39:23.725817 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 03:39:23.729502 master-0 kubenswrapper[13046]: I0308 03:39:23.727554 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 08 03:39:23.729502 master-0 kubenswrapper[13046]: I0308 03:39:23.729431 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 08 03:39:23.735498 master-0 kubenswrapper[13046]: I0308 03:39:23.732162 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:23.739503 master-0 kubenswrapper[13046]: I0308 03:39:23.735627 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.752938 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-internal-tls-certs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753018 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-svc\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753086 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: 
\"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753145 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxq6r\" (UniqueName: \"kubernetes.io/projected/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-kube-api-access-xxq6r\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753177 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753235 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753263 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753297 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753340 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753360 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngcvt\" (UniqueName: \"kubernetes.io/projected/20c587ce-777a-463e-877b-ada22b560d4c-kube-api-access-ngcvt\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753410 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-config\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753444 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753477 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-swift-storage-0\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:23.755416 master-0 kubenswrapper[13046]: I0308 03:39:23.753515 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.757231 master-0 kubenswrapper[13046]: I0308 03:39:23.756613 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-logs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.757231 master-0 kubenswrapper[13046]: I0308 03:39:23.757198 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-internal-tls-certs\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.757328 master-0 kubenswrapper[13046]: I0308 03:39:23.757244 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-httpd-run\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.763424 master-0 
kubenswrapper[13046]: I0308 03:39:23.761568 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-scripts\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.768696 master-0 kubenswrapper[13046]: I0308 03:39:23.766874 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-combined-ca-bundle\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.770436 master-0 kubenswrapper[13046]: I0308 03:39:23.770020 13046 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 03:39:23.770436 master-0 kubenswrapper[13046]: I0308 03:39:23.770075 13046 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/46958aa2ee719eeb05ed2819dd5b5bd381312e73d29f62f1a59eb590a4eaa799/globalmount\"" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.774085 master-0 kubenswrapper[13046]: I0308 03:39:23.774053 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-config-data\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:23.784769 master-0 
kubenswrapper[13046]: I0308 03:39:23.783660 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxq6r\" (UniqueName: \"kubernetes.io/projected/3dd0b908-1f05-4fca-89a8-b6eb1f41c33d-kube-api-access-xxq6r\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:39:23.834477 master-0 kubenswrapper[13046]: I0308 03:39:23.834350 13046 generic.go:334] "Generic (PLEG): container finished" podID="38bc77fe-5af2-4fe1-b7d5-321250be828e" containerID="011294082c37a5329c4e22fb7c0aef8f312655fabcd99e66c51dbe3fd5a0d2e3" exitCode=0
Mar 08 03:39:23.834477 master-0 kubenswrapper[13046]: I0308 03:39:23.834427 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" event={"ID":"38bc77fe-5af2-4fe1-b7d5-321250be828e","Type":"ContainerDied","Data":"011294082c37a5329c4e22fb7c0aef8f312655fabcd99e66c51dbe3fd5a0d2e3"}
Mar 08 03:39:23.839616 master-0 kubenswrapper[13046]: I0308 03:39:23.839559 13046 generic.go:334] "Generic (PLEG): container finished" podID="2f3459cf-9e53-466c-906d-6a9033b782f1" containerID="2f3414ba3620c1892aedff07a5a4ed1960c34f3a19ab0fa0b736b988f75cf166" exitCode=0
Mar 08 03:39:23.839817 master-0 kubenswrapper[13046]: I0308 03:39:23.839705 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2420-account-create-update-mb88p" event={"ID":"2f3459cf-9e53-466c-906d-6a9033b782f1","Type":"ContainerDied","Data":"2f3414ba3620c1892aedff07a5a4ed1960c34f3a19ab0fa0b736b988f75cf166"}
Mar 08 03:39:23.841066 master-0 kubenswrapper[13046]: I0308 03:39:23.841027 13046 generic.go:334] "Generic (PLEG): container finished" podID="5f9dfc00-d4d2-454d-b76f-076660d4f9e2" containerID="4279058fa337d712d6af823b86ef9aaef600530db00fb1400bb2661a453c5f26" exitCode=0
Mar 08 03:39:23.841132 master-0 kubenswrapper[13046]: I0308 03:39:23.841106 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-65fa-account-create-update-rktss" event={"ID":"5f9dfc00-d4d2-454d-b76f-076660d4f9e2","Type":"ContainerDied","Data":"4279058fa337d712d6af823b86ef9aaef600530db00fb1400bb2661a453c5f26"}
Mar 08 03:39:23.849279 master-0 kubenswrapper[13046]: I0308 03:39:23.849233 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2736a83d-7185-4aad-af7a-9b36d243400d","Type":"ContainerStarted","Data":"3a69d17aa7cddf020f91cd0cc15007951f502ffbd69cb6c6fddacff48ad3c9d5"}
Mar 08 03:39:23.853040 master-0 kubenswrapper[13046]: I0308 03:39:23.852996 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-765dd9b47b-xzxwf" event={"ID":"d6603220-12c5-4879-a72d-f12a27e7ed84","Type":"ContainerStarted","Data":"ef32c4d99786f61d36137f72bcd2ae268773755f8c3d2bd25e668eb93ff17ac4"}
Mar 08 03:39:23.855261 master-0 kubenswrapper[13046]: I0308 03:39:23.854877 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-765dd9b47b-xzxwf"
Mar 08 03:39:23.855685 master-0 kubenswrapper[13046]: I0308 03:39:23.855649 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-765dd9b47b-xzxwf"
Mar 08 03:39:23.856935 master-0 kubenswrapper[13046]: I0308 03:39:23.856904 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerStarted","Data":"d79584f6452ae35aeeae3f0d940aaa6c96f4e9af7cf059a9a950b56d7c3263dd"}
Mar 08 03:39:23.857872 master-0 kubenswrapper[13046]: I0308 03:39:23.857840 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.857937 master-0 kubenswrapper[13046]: I0308 03:39:23.857911 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-config\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.857996 master-0 kubenswrapper[13046]: I0308 03:39:23.857979 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngcvt\" (UniqueName: \"kubernetes.io/projected/20c587ce-777a-463e-877b-ada22b560d4c-kube-api-access-ngcvt\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.858060 master-0 kubenswrapper[13046]: I0308 03:39:23.858044 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98a9871e-f04f-49db-a626-5ca80990f3c9-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.858126 master-0 kubenswrapper[13046]: I0308 03:39:23.858081 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-config\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.858174 master-0 kubenswrapper[13046]: I0308 03:39:23.858134 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-swift-storage-0\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.858206 master-0 kubenswrapper[13046]: I0308 03:39:23.858171 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.858206 master-0 kubenswrapper[13046]: I0308 03:39:23.858193 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.859382 master-0 kubenswrapper[13046]: I0308 03:39:23.859338 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.859442 master-0 kubenswrapper[13046]: I0308 03:39:23.859407 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-svc\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.859586 master-0 kubenswrapper[13046]: I0308 03:39:23.859565 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.861095 master-0 kubenswrapper[13046]: I0308 03:39:23.859665 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.861095 master-0 kubenswrapper[13046]: I0308 03:39:23.859699 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlzw7\" (UniqueName: \"kubernetes.io/projected/98a9871e-f04f-49db-a626-5ca80990f3c9-kube-api-access-nlzw7\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.861095 master-0 kubenswrapper[13046]: I0308 03:39:23.859743 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-scripts\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.861095 master-0 kubenswrapper[13046]: I0308 03:39:23.860577 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-svc\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.869195 master-0 kubenswrapper[13046]: I0308 03:39:23.862705 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-swift-storage-0\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.869195 master-0 kubenswrapper[13046]: I0308 03:39:23.867648 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.882629 master-0 kubenswrapper[13046]: I0308 03:39:23.877348 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-config\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.885796 master-0 kubenswrapper[13046]: I0308 03:39:23.885500 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" event={"ID":"d093b57a-247f-4d76-8ad2-659f459f5f1a","Type":"ContainerStarted","Data":"8619ae3928a321abbf9035ee20e6b5850a2fa077290ee0359d8a8e2cbef94200"}
Mar 08 03:39:23.886466 master-0 kubenswrapper[13046]: I0308 03:39:23.886427 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw"
Mar 08 03:39:23.886654 master-0 kubenswrapper[13046]: I0308 03:39:23.886627 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngcvt\" (UniqueName: \"kubernetes.io/projected/20c587ce-777a-463e-877b-ada22b560d4c-kube-api-access-ngcvt\") pod \"dnsmasq-dns-76fd8d55b9-c4jcr\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:23.940548 master-0 kubenswrapper[13046]: I0308 03:39:23.937176 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.846146261 podStartE2EDuration="26.937157504s" podCreationTimestamp="2026-03-08 03:38:57 +0000 UTC" firstStartedPulling="2026-03-08 03:38:58.776533103 +0000 UTC m=+1540.855300320" lastFinishedPulling="2026-03-08 03:39:21.867544336 +0000 UTC m=+1563.946311563" observedRunningTime="2026-03-08 03:39:23.905977338 +0000 UTC m=+1565.984744545" watchObservedRunningTime="2026-03-08 03:39:23.937157504 +0000 UTC m=+1566.015924721"
Mar 08 03:39:23.954345 master-0 kubenswrapper[13046]: I0308 03:39:23.953910 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-765dd9b47b-xzxwf" podStartSLOduration=16.95388939 podStartE2EDuration="16.95388939s" podCreationTimestamp="2026-03-08 03:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:23.931602787 +0000 UTC m=+1566.010370004" watchObservedRunningTime="2026-03-08 03:39:23.95388939 +0000 UTC m=+1566.032656617"
Mar 08 03:39:23.962183 master-0 kubenswrapper[13046]: I0308 03:39:23.962102 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.962183 master-0 kubenswrapper[13046]: I0308 03:39:23.962155 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlzw7\" (UniqueName: \"kubernetes.io/projected/98a9871e-f04f-49db-a626-5ca80990f3c9-kube-api-access-nlzw7\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.962183 master-0 kubenswrapper[13046]: I0308 03:39:23.962185 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-scripts\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.964239 master-0 kubenswrapper[13046]: I0308 03:39:23.963939 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-config\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.964332 master-0 kubenswrapper[13046]: I0308 03:39:23.964276 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98a9871e-f04f-49db-a626-5ca80990f3c9-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.964620 master-0 kubenswrapper[13046]: I0308 03:39:23.964523 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.964620 master-0 kubenswrapper[13046]: I0308 03:39:23.964559 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.966924 master-0 kubenswrapper[13046]: I0308 03:39:23.966866 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.967407 master-0 kubenswrapper[13046]: I0308 03:39:23.967373 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.971275 master-0 kubenswrapper[13046]: I0308 03:39:23.970905 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.972012 master-0 kubenswrapper[13046]: I0308 03:39:23.971974 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98a9871e-f04f-49db-a626-5ca80990f3c9-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.972620 master-0 kubenswrapper[13046]: I0308 03:39:23.972558 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-scripts\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:23.980343 master-0 kubenswrapper[13046]: I0308 03:39:23.979733 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-config\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:24.006041 master-0 kubenswrapper[13046]: I0308 03:39:24.004604 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr"
Mar 08 03:39:24.006041 master-0 kubenswrapper[13046]: I0308 03:39:24.004875 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlzw7\" (UniqueName: \"kubernetes.io/projected/98a9871e-f04f-49db-a626-5ca80990f3c9-kube-api-access-nlzw7\") pod \"ironic-inspector-0\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " pod="openstack/ironic-inspector-0"
Mar 08 03:39:24.049606 master-0 kubenswrapper[13046]: I0308 03:39:24.048336 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 08 03:39:24.154201 master-0 kubenswrapper[13046]: I0308 03:39:24.143062 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e537f07-7fd3-4505-8490-b028c741c650" path="/var/lib/kubelet/pods/3e537f07-7fd3-4505-8490-b028c741c650/volumes"
Mar 08 03:39:24.154201 master-0 kubenswrapper[13046]: I0308 03:39:24.149146 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8edfcc-d36d-445e-918b-38a71b2cafa4" path="/var/lib/kubelet/pods/ea8edfcc-d36d-445e-918b-38a71b2cafa4/volumes"
Mar 08 03:39:24.537252 master-0 kubenswrapper[13046]: I0308 03:39:24.537191 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xds8t"
Mar 08 03:39:24.584544 master-0 kubenswrapper[13046]: I0308 03:39:24.584468 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8zfh\" (UniqueName: \"kubernetes.io/projected/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-kube-api-access-n8zfh\") pod \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") "
Mar 08 03:39:24.584544 master-0 kubenswrapper[13046]: I0308 03:39:24.584549 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-operator-scripts\") pod \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\" (UID: \"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682\") "
Mar 08 03:39:24.585799 master-0 kubenswrapper[13046]: I0308 03:39:24.585768 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b8ddcef-7329-4111-b9bd-c4ce0d6bf682" (UID: "7b8ddcef-7329-4111-b9bd-c4ce0d6bf682"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:39:24.597573 master-0 kubenswrapper[13046]: I0308 03:39:24.594970 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-kube-api-access-n8zfh" (OuterVolumeSpecName: "kube-api-access-n8zfh") pod "7b8ddcef-7329-4111-b9bd-c4ce0d6bf682" (UID: "7b8ddcef-7329-4111-b9bd-c4ce0d6bf682"). InnerVolumeSpecName "kube-api-access-n8zfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:24.687503 master-0 kubenswrapper[13046]: I0308 03:39:24.687439 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:24.690837 master-0 kubenswrapper[13046]: I0308 03:39:24.688044 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8zfh\" (UniqueName: \"kubernetes.io/projected/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682-kube-api-access-n8zfh\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:24.918503 master-0 kubenswrapper[13046]: I0308 03:39:24.909051 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pdtcb"
Mar 08 03:39:24.945520 master-0 kubenswrapper[13046]: I0308 03:39:24.935474 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fd8d55b9-c4jcr"]
Mar 08 03:39:24.992789 master-0 kubenswrapper[13046]: I0308 03:39:24.990347 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6x7dc"
Mar 08 03:39:24.992789 master-0 kubenswrapper[13046]: I0308 03:39:24.991353 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pdtcb"
Mar 08 03:39:24.992789 master-0 kubenswrapper[13046]: I0308 03:39:24.991790 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pdtcb" event={"ID":"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11","Type":"ContainerDied","Data":"b8b243df0138b7faa36db1f9461882c3564ff5b85963de2967a2eccfde298963"}
Mar 08 03:39:24.992789 master-0 kubenswrapper[13046]: I0308 03:39:24.991850 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8b243df0138b7faa36db1f9461882c3564ff5b85963de2967a2eccfde298963"
Mar 08 03:39:25.018553 master-0 kubenswrapper[13046]: I0308 03:39:25.007848 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-operator-scripts\") pod \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") "
Mar 08 03:39:25.018553 master-0 kubenswrapper[13046]: I0308 03:39:25.008074 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lc88\" (UniqueName: \"kubernetes.io/projected/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-kube-api-access-6lc88\") pod \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\" (UID: \"a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11\") "
Mar 08 03:39:25.018553 master-0 kubenswrapper[13046]: I0308 03:39:25.008535 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11" (UID: "a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:39:25.018553 master-0 kubenswrapper[13046]: I0308 03:39:25.009383 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:25.033340 master-0 kubenswrapper[13046]: I0308 03:39:25.024203 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6x7dc" event={"ID":"db81925e-2987-49d2-a511-33a9f40ddd8c","Type":"ContainerDied","Data":"a8901e3b591ad98b4210855a52699d8c6a4c5cdd4d8c13e166afaf3b7d6942e1"}
Mar 08 03:39:25.033340 master-0 kubenswrapper[13046]: I0308 03:39:25.024259 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8901e3b591ad98b4210855a52699d8c6a4c5cdd4d8c13e166afaf3b7d6942e1"
Mar 08 03:39:25.033340 master-0 kubenswrapper[13046]: I0308 03:39:25.024330 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6x7dc"
Mar 08 03:39:25.033340 master-0 kubenswrapper[13046]: I0308 03:39:25.024706 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-kube-api-access-6lc88" (OuterVolumeSpecName: "kube-api-access-6lc88") pod "a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11" (UID: "a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11"). InnerVolumeSpecName "kube-api-access-6lc88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:25.066576 master-0 kubenswrapper[13046]: I0308 03:39:25.066527 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xds8t"
Mar 08 03:39:25.066862 master-0 kubenswrapper[13046]: I0308 03:39:25.066651 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xds8t" event={"ID":"7b8ddcef-7329-4111-b9bd-c4ce0d6bf682","Type":"ContainerDied","Data":"f8b5c9080525db5c91440b55c8914e964a36e677e958e0f0e4481fb016a81eeb"}
Mar 08 03:39:25.066862 master-0 kubenswrapper[13046]: I0308 03:39:25.066763 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8b5c9080525db5c91440b55c8914e964a36e677e958e0f0e4481fb016a81eeb"
Mar 08 03:39:25.074178 master-0 kubenswrapper[13046]: I0308 03:39:25.070906 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" event={"ID":"20c587ce-777a-463e-877b-ada22b560d4c","Type":"ContainerStarted","Data":"b641a36155cb3df4ffedac4ba8a2c997d5cd5a94be70e8d37cb741f925591a10"}
Mar 08 03:39:25.112053 master-0 kubenswrapper[13046]: I0308 03:39:25.110500 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d72vc\" (UniqueName: \"kubernetes.io/projected/db81925e-2987-49d2-a511-33a9f40ddd8c-kube-api-access-d72vc\") pod \"db81925e-2987-49d2-a511-33a9f40ddd8c\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") "
Mar 08 03:39:25.112053 master-0 kubenswrapper[13046]: I0308 03:39:25.110624 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db81925e-2987-49d2-a511-33a9f40ddd8c-operator-scripts\") pod \"db81925e-2987-49d2-a511-33a9f40ddd8c\" (UID: \"db81925e-2987-49d2-a511-33a9f40ddd8c\") "
Mar 08 03:39:25.112053 master-0 kubenswrapper[13046]: I0308 03:39:25.111334 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lc88\" (UniqueName: \"kubernetes.io/projected/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11-kube-api-access-6lc88\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:25.120821 master-0 kubenswrapper[13046]: I0308 03:39:25.118229 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db81925e-2987-49d2-a511-33a9f40ddd8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db81925e-2987-49d2-a511-33a9f40ddd8c" (UID: "db81925e-2987-49d2-a511-33a9f40ddd8c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:39:25.144722 master-0 kubenswrapper[13046]: I0308 03:39:25.144661 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db81925e-2987-49d2-a511-33a9f40ddd8c-kube-api-access-d72vc" (OuterVolumeSpecName: "kube-api-access-d72vc") pod "db81925e-2987-49d2-a511-33a9f40ddd8c" (UID: "db81925e-2987-49d2-a511-33a9f40ddd8c"). InnerVolumeSpecName "kube-api-access-d72vc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:25.213953 master-0 kubenswrapper[13046]: I0308 03:39:25.213011 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d72vc\" (UniqueName: \"kubernetes.io/projected/db81925e-2987-49d2-a511-33a9f40ddd8c-kube-api-access-d72vc\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:25.213953 master-0 kubenswrapper[13046]: I0308 03:39:25.213052 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db81925e-2987-49d2-a511-33a9f40ddd8c-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:25.293710 master-0 kubenswrapper[13046]: I0308 03:39:25.292634 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 08 03:39:26.044958 master-0 kubenswrapper[13046]: I0308 03:39:26.043901 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb"
Mar 08 03:39:26.049787 master-0 kubenswrapper[13046]: I0308 03:39:26.049617 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-65fa-account-create-update-rktss"
Mar 08 03:39:26.055241 master-0 kubenswrapper[13046]: I0308 03:39:26.055198 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2420-account-create-update-mb88p"
Mar 08 03:39:26.095739 master-0 kubenswrapper[13046]: I0308 03:39:26.095377 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-65fa-account-create-update-rktss" event={"ID":"5f9dfc00-d4d2-454d-b76f-076660d4f9e2","Type":"ContainerDied","Data":"ce9c07d4825aeb32786f0980d1ce2be3d0790f02dfb3c53a8cbb905047f3973d"}
Mar 08 03:39:26.095739 master-0 kubenswrapper[13046]: I0308 03:39:26.095430 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce9c07d4825aeb32786f0980d1ce2be3d0790f02dfb3c53a8cbb905047f3973d"
Mar 08 03:39:26.095739 master-0 kubenswrapper[13046]: I0308 03:39:26.095562 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-65fa-account-create-update-rktss"
Mar 08 03:39:26.108248 master-0 kubenswrapper[13046]: I0308 03:39:26.108182 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb" event={"ID":"38bc77fe-5af2-4fe1-b7d5-321250be828e","Type":"ContainerDied","Data":"3f5976359ed5966dc73ec942c57af1f4574c67d661dc106afea78a8fb354f6a6"}
Mar 08 03:39:26.108248 master-0 kubenswrapper[13046]: I0308 03:39:26.108232 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f5976359ed5966dc73ec942c57af1f4574c67d661dc106afea78a8fb354f6a6"
Mar 08 03:39:26.108524 master-0 kubenswrapper[13046]: I0308 03:39:26.108267 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4cfc-account-create-update-sf2rb"
Mar 08 03:39:26.110584 master-0 kubenswrapper[13046]: I0308 03:39:26.110546 13046 generic.go:334] "Generic (PLEG): container finished" podID="20c587ce-777a-463e-877b-ada22b560d4c" containerID="4739e085fc55b85aa30b9a7652d9fa14ea69654f8e9802bb852abe2d1f2b0955" exitCode=0
Mar 08 03:39:26.110685 master-0 kubenswrapper[13046]: I0308 03:39:26.110620 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" event={"ID":"20c587ce-777a-463e-877b-ada22b560d4c","Type":"ContainerDied","Data":"4739e085fc55b85aa30b9a7652d9fa14ea69654f8e9802bb852abe2d1f2b0955"}
Mar 08 03:39:26.113108 master-0 kubenswrapper[13046]: I0308 03:39:26.113075 13046 generic.go:334] "Generic (PLEG): container finished" podID="98a9871e-f04f-49db-a626-5ca80990f3c9" containerID="c761151777221b6ff32d7ff4667d01dd8c58b3604eee6189ea8012e808a12c76" exitCode=0
Mar 08 03:39:26.113241 master-0 kubenswrapper[13046]: I0308 03:39:26.113222 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"98a9871e-f04f-49db-a626-5ca80990f3c9","Type":"ContainerDied","Data":"c761151777221b6ff32d7ff4667d01dd8c58b3604eee6189ea8012e808a12c76"}
Mar 08 03:39:26.113344 master-0 kubenswrapper[13046]: I0308 03:39:26.113326 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"98a9871e-f04f-49db-a626-5ca80990f3c9","Type":"ContainerStarted","Data":"b1398ccc0b85990bb6e4cc3b7eeae86db81d42894e271c3d612788106638afd4"}
Mar 08 03:39:26.128657 master-0 kubenswrapper[13046]: I0308 03:39:26.126197 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2420-account-create-update-mb88p"
Mar 08 03:39:26.193414 master-0 kubenswrapper[13046]: I0308 03:39:26.193353 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2420-account-create-update-mb88p" event={"ID":"2f3459cf-9e53-466c-906d-6a9033b782f1","Type":"ContainerDied","Data":"eeab13b27c71160d81630fab08f9be3b6945f8ac8ebd20056cc012cea0a993c7"}
Mar 08 03:39:26.193414 master-0 kubenswrapper[13046]: I0308 03:39:26.193414 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeab13b27c71160d81630fab08f9be3b6945f8ac8ebd20056cc012cea0a993c7"
Mar 08 03:39:26.250693 master-0 kubenswrapper[13046]: I0308 03:39:26.250639 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fmvc\" (UniqueName: \"kubernetes.io/projected/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-kube-api-access-6fmvc\") pod \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") "
Mar 08 03:39:26.250807 master-0 kubenswrapper[13046]: I0308 03:39:26.250742 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-operator-scripts\") pod \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\" (UID: \"5f9dfc00-d4d2-454d-b76f-076660d4f9e2\") "
Mar 08 03:39:26.250807 master-0 kubenswrapper[13046]: I0308 03:39:26.250770 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bc77fe-5af2-4fe1-b7d5-321250be828e-operator-scripts\") pod \"38bc77fe-5af2-4fe1-b7d5-321250be828e\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") "
Mar 08 03:39:26.250935 master-0 kubenswrapper[13046]: I0308 03:39:26.250910 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svhtn\" (UniqueName: \"kubernetes.io/projected/38bc77fe-5af2-4fe1-b7d5-321250be828e-kube-api-access-svhtn\") pod \"38bc77fe-5af2-4fe1-b7d5-321250be828e\" (UID: \"38bc77fe-5af2-4fe1-b7d5-321250be828e\") "
Mar 08 03:39:26.250992 master-0 kubenswrapper[13046]: I0308 03:39:26.250980 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f3459cf-9e53-466c-906d-6a9033b782f1-operator-scripts\") pod \"2f3459cf-9e53-466c-906d-6a9033b782f1\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") "
Mar 08 03:39:26.251049 master-0 kubenswrapper[13046]: I0308 03:39:26.251025 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448bw\" (UniqueName: \"kubernetes.io/projected/2f3459cf-9e53-466c-906d-6a9033b782f1-kube-api-access-448bw\") pod \"2f3459cf-9e53-466c-906d-6a9033b782f1\" (UID: \"2f3459cf-9e53-466c-906d-6a9033b782f1\") "
Mar 08 03:39:26.251570 master-0 kubenswrapper[13046]: I0308 03:39:26.251509 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f9dfc00-d4d2-454d-b76f-076660d4f9e2" (UID: "5f9dfc00-d4d2-454d-b76f-076660d4f9e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:39:26.252418 master-0 kubenswrapper[13046]: I0308 03:39:26.252375 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f3459cf-9e53-466c-906d-6a9033b782f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f3459cf-9e53-466c-906d-6a9033b782f1" (UID: "2f3459cf-9e53-466c-906d-6a9033b782f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:39:26.252590 master-0 kubenswrapper[13046]: I0308 03:39:26.252545 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bc77fe-5af2-4fe1-b7d5-321250be828e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38bc77fe-5af2-4fe1-b7d5-321250be828e" (UID: "38bc77fe-5af2-4fe1-b7d5-321250be828e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 03:39:26.255949 master-0 kubenswrapper[13046]: I0308 03:39:26.255886 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bc77fe-5af2-4fe1-b7d5-321250be828e-kube-api-access-svhtn" (OuterVolumeSpecName: "kube-api-access-svhtn") pod "38bc77fe-5af2-4fe1-b7d5-321250be828e" (UID: "38bc77fe-5af2-4fe1-b7d5-321250be828e"). InnerVolumeSpecName "kube-api-access-svhtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:26.256239 master-0 kubenswrapper[13046]: I0308 03:39:26.256209 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3459cf-9e53-466c-906d-6a9033b782f1-kube-api-access-448bw" (OuterVolumeSpecName: "kube-api-access-448bw") pod "2f3459cf-9e53-466c-906d-6a9033b782f1" (UID: "2f3459cf-9e53-466c-906d-6a9033b782f1"). InnerVolumeSpecName "kube-api-access-448bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:26.261839 master-0 kubenswrapper[13046]: I0308 03:39:26.261771 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-kube-api-access-6fmvc" (OuterVolumeSpecName: "kube-api-access-6fmvc") pod "5f9dfc00-d4d2-454d-b76f-076660d4f9e2" (UID: "5f9dfc00-d4d2-454d-b76f-076660d4f9e2"). InnerVolumeSpecName "kube-api-access-6fmvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:39:26.358506 master-0 kubenswrapper[13046]: I0308 03:39:26.354350 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:26.358506 master-0 kubenswrapper[13046]: I0308 03:39:26.354401 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38bc77fe-5af2-4fe1-b7d5-321250be828e-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:26.358506 master-0 kubenswrapper[13046]: I0308 03:39:26.354414 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svhtn\" (UniqueName: \"kubernetes.io/projected/38bc77fe-5af2-4fe1-b7d5-321250be828e-kube-api-access-svhtn\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:26.358506 master-0 kubenswrapper[13046]: I0308 03:39:26.354429 13046 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f3459cf-9e53-466c-906d-6a9033b782f1-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:39:26.358506 master-0 kubenswrapper[13046]: I0308 03:39:26.354440 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448bw\" (UniqueName: \"kubernetes.io/projected/2f3459cf-9e53-466c-906d-6a9033b782f1-kube-api-access-448bw\") on node \"master-0\"
DevicePath \"\"" Mar 08 03:39:26.358506 master-0 kubenswrapper[13046]: I0308 03:39:26.354452 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fmvc\" (UniqueName: \"kubernetes.io/projected/5f9dfc00-d4d2-454d-b76f-076660d4f9e2-kube-api-access-6fmvc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:27.143412 master-0 kubenswrapper[13046]: I0308 03:39:27.143359 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" event={"ID":"20c587ce-777a-463e-877b-ada22b560d4c","Type":"ContainerStarted","Data":"9ae6a1821890e3ce840e0fc548870c8c23bf2e129aee66ef2fa0e5b43bb3524e"} Mar 08 03:39:27.143904 master-0 kubenswrapper[13046]: I0308 03:39:27.143559 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:27.316696 master-0 kubenswrapper[13046]: I0308 03:39:27.316648 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f97c05dc-9bae-44b7-bf68-647f2e5a82fd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^576e91b7-8ab0-40de-9966-656c3360fc82\") pod \"glance-bf784-default-external-api-0\" (UID: \"98c075f3-f193-415e-a94d-d5ee77a6738b\") " pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:27.746216 master-0 kubenswrapper[13046]: I0308 03:39:27.746149 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:28.073351 master-0 kubenswrapper[13046]: I0308 03:39:28.073302 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:28.109124 master-0 kubenswrapper[13046]: I0308 03:39:28.108274 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-765dd9b47b-xzxwf" Mar 08 03:39:28.134885 master-0 kubenswrapper[13046]: I0308 03:39:28.134649 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" podStartSLOduration=5.134622943 podStartE2EDuration="5.134622943s" podCreationTimestamp="2026-03-08 03:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:28.106385591 +0000 UTC m=+1570.185152808" watchObservedRunningTime="2026-03-08 03:39:28.134622943 +0000 UTC m=+1570.213390160" Mar 08 03:39:28.213594 master-0 kubenswrapper[13046]: I0308 03:39:28.209472 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a6c3dbd-1c48-429d-b579-5d2e0d41334c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^409f10c3-7eb1-40c5-ae89-ffb30196d7c0\") pod \"glance-bf784-default-internal-api-0\" (UID: \"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d\") " pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:28.288331 master-0 kubenswrapper[13046]: I0308 03:39:28.286596 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:28.414329 master-0 kubenswrapper[13046]: W0308 03:39:28.410965 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98c075f3_f193_415e_a94d_d5ee77a6738b.slice/crio-fa6bbdc4824c6225d7e38fd4914bb9760bbec442d836521b755d6d09d5765596 WatchSource:0}: Error finding container fa6bbdc4824c6225d7e38fd4914bb9760bbec442d836521b755d6d09d5765596: Status 404 returned error can't find the container with id fa6bbdc4824c6225d7e38fd4914bb9760bbec442d836521b755d6d09d5765596 Mar 08 03:39:28.414329 master-0 kubenswrapper[13046]: I0308 03:39:28.413025 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-external-api-0"] Mar 08 03:39:28.962905 master-0 kubenswrapper[13046]: I0308 03:39:28.962850 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf784-default-internal-api-0"] Mar 08 03:39:28.964796 master-0 kubenswrapper[13046]: W0308 03:39:28.964458 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dd0b908_1f05_4fca_89a8_b6eb1f41c33d.slice/crio-1d2c0766c62614bc0a5e3281433c53325fcc74f158fabfbd727f6a03d1e69dfc WatchSource:0}: Error finding container 1d2c0766c62614bc0a5e3281433c53325fcc74f158fabfbd727f6a03d1e69dfc: Status 404 returned error can't find the container with id 1d2c0766c62614bc0a5e3281433c53325fcc74f158fabfbd727f6a03d1e69dfc Mar 08 03:39:29.071881 master-0 kubenswrapper[13046]: I0308 03:39:29.068275 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:29.285930 master-0 kubenswrapper[13046]: I0308 03:39:29.285875 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" 
event={"ID":"98c075f3-f193-415e-a94d-d5ee77a6738b","Type":"ContainerStarted","Data":"fa6bbdc4824c6225d7e38fd4914bb9760bbec442d836521b755d6d09d5765596"} Mar 08 03:39:29.287544 master-0 kubenswrapper[13046]: I0308 03:39:29.287511 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d","Type":"ContainerStarted","Data":"1d2c0766c62614bc0a5e3281433c53325fcc74f158fabfbd727f6a03d1e69dfc"} Mar 08 03:39:29.289807 master-0 kubenswrapper[13046]: I0308 03:39:29.289775 13046 generic.go:334] "Generic (PLEG): container finished" podID="528b1064-a3b2-4ea4-8584-abeffdbedbbe" containerID="d79584f6452ae35aeeae3f0d940aaa6c96f4e9af7cf059a9a950b56d7c3263dd" exitCode=0 Mar 08 03:39:29.289881 master-0 kubenswrapper[13046]: I0308 03:39:29.289805 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerDied","Data":"d79584f6452ae35aeeae3f0d940aaa6c96f4e9af7cf059a9a950b56d7c3263dd"} Mar 08 03:39:30.306731 master-0 kubenswrapper[13046]: I0308 03:39:30.306581 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"98c075f3-f193-415e-a94d-d5ee77a6738b","Type":"ContainerStarted","Data":"b8908413aaf46e7645c3f01424c3bbf60898eb005ebafd66464e1d0341085e35"} Mar 08 03:39:30.306731 master-0 kubenswrapper[13046]: I0308 03:39:30.306635 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-external-api-0" event={"ID":"98c075f3-f193-415e-a94d-d5ee77a6738b","Type":"ContainerStarted","Data":"bbb292156d746cca1b25d952e6b7732687eab38a39cf6cc9d93db2c6b08bfe30"} Mar 08 03:39:30.308245 master-0 kubenswrapper[13046]: I0308 03:39:30.308192 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" 
event={"ID":"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d","Type":"ContainerStarted","Data":"37bc889477be482293dc8c229235c4953c7f0af465eed4894c403c0db0f97b95"} Mar 08 03:39:30.366216 master-0 kubenswrapper[13046]: I0308 03:39:30.365917 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bf784-default-external-api-0" podStartSLOduration=7.365896136 podStartE2EDuration="7.365896136s" podCreationTimestamp="2026-03-08 03:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:30.35055813 +0000 UTC m=+1572.429325347" watchObservedRunningTime="2026-03-08 03:39:30.365896136 +0000 UTC m=+1572.444663353" Mar 08 03:39:30.815110 master-0 kubenswrapper[13046]: I0308 03:39:30.815063 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-9f967cb96-7vpvw" Mar 08 03:39:32.699562 master-0 kubenswrapper[13046]: I0308 03:39:32.699180 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ckdg7"] Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: E0308 03:39:32.699852 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9dfc00-d4d2-454d-b76f-076660d4f9e2" containerName="mariadb-account-create-update" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: I0308 03:39:32.699868 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9dfc00-d4d2-454d-b76f-076660d4f9e2" containerName="mariadb-account-create-update" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: E0308 03:39:32.699896 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3459cf-9e53-466c-906d-6a9033b782f1" containerName="mariadb-account-create-update" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: I0308 03:39:32.699902 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3459cf-9e53-466c-906d-6a9033b782f1" 
containerName="mariadb-account-create-update" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: E0308 03:39:32.699920 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bc77fe-5af2-4fe1-b7d5-321250be828e" containerName="mariadb-account-create-update" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: I0308 03:39:32.699927 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bc77fe-5af2-4fe1-b7d5-321250be828e" containerName="mariadb-account-create-update" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: E0308 03:39:32.699937 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db81925e-2987-49d2-a511-33a9f40ddd8c" containerName="mariadb-database-create" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: I0308 03:39:32.699943 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="db81925e-2987-49d2-a511-33a9f40ddd8c" containerName="mariadb-database-create" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: E0308 03:39:32.699967 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8ddcef-7329-4111-b9bd-c4ce0d6bf682" containerName="mariadb-database-create" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: I0308 03:39:32.699973 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8ddcef-7329-4111-b9bd-c4ce0d6bf682" containerName="mariadb-database-create" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: E0308 03:39:32.699992 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11" containerName="mariadb-database-create" Mar 08 03:39:32.700017 master-0 kubenswrapper[13046]: I0308 03:39:32.699999 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11" containerName="mariadb-database-create" Mar 08 03:39:32.700363 master-0 kubenswrapper[13046]: I0308 03:39:32.700206 13046 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2f3459cf-9e53-466c-906d-6a9033b782f1" containerName="mariadb-account-create-update" Mar 08 03:39:32.700363 master-0 kubenswrapper[13046]: I0308 03:39:32.700251 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8ddcef-7329-4111-b9bd-c4ce0d6bf682" containerName="mariadb-database-create" Mar 08 03:39:32.700363 master-0 kubenswrapper[13046]: I0308 03:39:32.700266 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bc77fe-5af2-4fe1-b7d5-321250be828e" containerName="mariadb-account-create-update" Mar 08 03:39:32.700363 master-0 kubenswrapper[13046]: I0308 03:39:32.700280 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="db81925e-2987-49d2-a511-33a9f40ddd8c" containerName="mariadb-database-create" Mar 08 03:39:32.700363 master-0 kubenswrapper[13046]: I0308 03:39:32.700291 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9dfc00-d4d2-454d-b76f-076660d4f9e2" containerName="mariadb-account-create-update" Mar 08 03:39:32.700363 master-0 kubenswrapper[13046]: I0308 03:39:32.700304 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11" containerName="mariadb-database-create" Mar 08 03:39:32.701017 master-0 kubenswrapper[13046]: I0308 03:39:32.700992 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.703377 master-0 kubenswrapper[13046]: I0308 03:39:32.703337 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 03:39:32.703606 master-0 kubenswrapper[13046]: I0308 03:39:32.703586 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 08 03:39:32.742916 master-0 kubenswrapper[13046]: I0308 03:39:32.742435 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ckdg7"] Mar 08 03:39:32.892257 master-0 kubenswrapper[13046]: I0308 03:39:32.892141 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-config-data\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.892474 master-0 kubenswrapper[13046]: I0308 03:39:32.892438 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-scripts\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.892801 master-0 kubenswrapper[13046]: I0308 03:39:32.892785 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv8mg\" (UniqueName: \"kubernetes.io/projected/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-kube-api-access-tv8mg\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.893007 master-0 kubenswrapper[13046]: I0308 
03:39:32.892992 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.994991 master-0 kubenswrapper[13046]: I0308 03:39:32.994877 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-config-data\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.994991 master-0 kubenswrapper[13046]: I0308 03:39:32.994935 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-scripts\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.995193 master-0 kubenswrapper[13046]: I0308 03:39:32.995030 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv8mg\" (UniqueName: \"kubernetes.io/projected/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-kube-api-access-tv8mg\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.995193 master-0 kubenswrapper[13046]: I0308 03:39:32.995093 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " 
pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.998452 master-0 kubenswrapper[13046]: I0308 03:39:32.998303 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-scripts\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:32.999234 master-0 kubenswrapper[13046]: I0308 03:39:32.999176 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-config-data\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:33.001248 master-0 kubenswrapper[13046]: I0308 03:39:33.001213 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:33.010802 master-0 kubenswrapper[13046]: I0308 03:39:33.010751 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv8mg\" (UniqueName: \"kubernetes.io/projected/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-kube-api-access-tv8mg\") pod \"nova-cell0-conductor-db-sync-ckdg7\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") " pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:33.046506 master-0 kubenswrapper[13046]: I0308 03:39:33.046302 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ckdg7" Mar 08 03:39:33.413508 master-0 kubenswrapper[13046]: I0308 03:39:33.412810 13046 generic.go:334] "Generic (PLEG): container finished" podID="98a9871e-f04f-49db-a626-5ca80990f3c9" containerID="e9301bafd173bd555c29d451b9f1a50677d884a8bad1eaff371f0025aa15404d" exitCode=0 Mar 08 03:39:33.413508 master-0 kubenswrapper[13046]: I0308 03:39:33.412887 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"98a9871e-f04f-49db-a626-5ca80990f3c9","Type":"ContainerDied","Data":"e9301bafd173bd555c29d451b9f1a50677d884a8bad1eaff371f0025aa15404d"} Mar 08 03:39:33.420499 master-0 kubenswrapper[13046]: I0308 03:39:33.415798 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerStarted","Data":"a3f39704bbe83d757e07514378179093996ce1eec717a8549e5e52edc164a630"} Mar 08 03:39:33.424500 master-0 kubenswrapper[13046]: I0308 03:39:33.421110 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf784-default-internal-api-0" event={"ID":"3dd0b908-1f05-4fca-89a8-b6eb1f41c33d","Type":"ContainerStarted","Data":"d53b11bcf49aecd7e76bb3e873c233c50ac8a478ff2a7e00d84e23733a45000c"} Mar 08 03:39:33.682321 master-0 kubenswrapper[13046]: W0308 03:39:33.681683 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod510d6b1b_5eec_47ab_ba92_23ef35ec5f83.slice/crio-cbc358aa5fb56cb9db5ffefe03f2b2b3dfcf759417d79cf3d492979b9ea26c8a WatchSource:0}: Error finding container cbc358aa5fb56cb9db5ffefe03f2b2b3dfcf759417d79cf3d492979b9ea26c8a: Status 404 returned error can't find the container with id cbc358aa5fb56cb9db5ffefe03f2b2b3dfcf759417d79cf3d492979b9ea26c8a Mar 08 03:39:33.713944 master-0 kubenswrapper[13046]: I0308 03:39:33.713892 13046 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell0-conductor-db-sync-ckdg7"] Mar 08 03:39:33.715942 master-0 kubenswrapper[13046]: I0308 03:39:33.715874 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bf784-default-internal-api-0" podStartSLOduration=10.715854554 podStartE2EDuration="10.715854554s" podCreationTimestamp="2026-03-08 03:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:33.682060384 +0000 UTC m=+1575.760827611" watchObservedRunningTime="2026-03-08 03:39:33.715854554 +0000 UTC m=+1575.794621771" Mar 08 03:39:34.007089 master-0 kubenswrapper[13046]: I0308 03:39:34.006854 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:39:34.274398 master-0 kubenswrapper[13046]: I0308 03:39:34.274363 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 03:39:34.283772 master-0 kubenswrapper[13046]: I0308 03:39:34.283552 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85fc44959f-w8nqq"] Mar 08 03:39:34.283914 master-0 kubenswrapper[13046]: I0308 03:39:34.283851 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" podUID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerName="dnsmasq-dns" containerID="cri-o://3c0972bc923fef035427b96163d52c64dcc7330036f87132230479f40575b42c" gracePeriod=10 Mar 08 03:39:34.435402 master-0 kubenswrapper[13046]: I0308 03:39:34.435359 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98a9871e-f04f-49db-a626-5ca80990f3c9-etc-podinfo\") pod \"98a9871e-f04f-49db-a626-5ca80990f3c9\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " Mar 08 03:39:34.435633 master-0 
kubenswrapper[13046]: I0308 03:39:34.435577 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-scripts\") pod \"98a9871e-f04f-49db-a626-5ca80990f3c9\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " Mar 08 03:39:34.435633 master-0 kubenswrapper[13046]: I0308 03:39:34.435610 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlzw7\" (UniqueName: \"kubernetes.io/projected/98a9871e-f04f-49db-a626-5ca80990f3c9-kube-api-access-nlzw7\") pod \"98a9871e-f04f-49db-a626-5ca80990f3c9\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " Mar 08 03:39:34.435722 master-0 kubenswrapper[13046]: I0308 03:39:34.435648 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-combined-ca-bundle\") pod \"98a9871e-f04f-49db-a626-5ca80990f3c9\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " Mar 08 03:39:34.435722 master-0 kubenswrapper[13046]: I0308 03:39:34.435714 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic\") pod \"98a9871e-f04f-49db-a626-5ca80990f3c9\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " Mar 08 03:39:34.435792 master-0 kubenswrapper[13046]: I0308 03:39:34.435743 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"98a9871e-f04f-49db-a626-5ca80990f3c9\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " Mar 08 03:39:34.436083 master-0 kubenswrapper[13046]: I0308 03:39:34.435864 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-config\") pod \"98a9871e-f04f-49db-a626-5ca80990f3c9\" (UID: \"98a9871e-f04f-49db-a626-5ca80990f3c9\") " Mar 08 03:39:34.437100 master-0 kubenswrapper[13046]: I0308 03:39:34.437049 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "98a9871e-f04f-49db-a626-5ca80990f3c9" (UID: "98a9871e-f04f-49db-a626-5ca80990f3c9"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:34.439463 master-0 kubenswrapper[13046]: I0308 03:39:34.439406 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-scripts" (OuterVolumeSpecName: "scripts") pod "98a9871e-f04f-49db-a626-5ca80990f3c9" (UID: "98a9871e-f04f-49db-a626-5ca80990f3c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:34.442707 master-0 kubenswrapper[13046]: I0308 03:39:34.442623 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98a9871e-f04f-49db-a626-5ca80990f3c9-kube-api-access-nlzw7" (OuterVolumeSpecName: "kube-api-access-nlzw7") pod "98a9871e-f04f-49db-a626-5ca80990f3c9" (UID: "98a9871e-f04f-49db-a626-5ca80990f3c9"). InnerVolumeSpecName "kube-api-access-nlzw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:39:34.442707 master-0 kubenswrapper[13046]: I0308 03:39:34.442633 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/98a9871e-f04f-49db-a626-5ca80990f3c9-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "98a9871e-f04f-49db-a626-5ca80990f3c9" (UID: "98a9871e-f04f-49db-a626-5ca80990f3c9"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 03:39:34.443651 master-0 kubenswrapper[13046]: I0308 03:39:34.443621 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-config" (OuterVolumeSpecName: "config") pod "98a9871e-f04f-49db-a626-5ca80990f3c9" (UID: "98a9871e-f04f-49db-a626-5ca80990f3c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:34.444635 master-0 kubenswrapper[13046]: I0308 03:39:34.444537 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "98a9871e-f04f-49db-a626-5ca80990f3c9" (UID: "98a9871e-f04f-49db-a626-5ca80990f3c9"). InnerVolumeSpecName "var-lib-ironic". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:39:34.447885 master-0 kubenswrapper[13046]: I0308 03:39:34.447845 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ckdg7" event={"ID":"510d6b1b-5eec-47ab-ba92-23ef35ec5f83","Type":"ContainerStarted","Data":"cbc358aa5fb56cb9db5ffefe03f2b2b3dfcf759417d79cf3d492979b9ea26c8a"} Mar 08 03:39:34.456875 master-0 kubenswrapper[13046]: I0308 03:39:34.456826 13046 generic.go:334] "Generic (PLEG): container finished" podID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerID="3c0972bc923fef035427b96163d52c64dcc7330036f87132230479f40575b42c" exitCode=0 Mar 08 03:39:34.457014 master-0 kubenswrapper[13046]: I0308 03:39:34.456893 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" event={"ID":"fbcbe677-7983-4b97-9146-604451f6b8d6","Type":"ContainerDied","Data":"3c0972bc923fef035427b96163d52c64dcc7330036f87132230479f40575b42c"} Mar 08 03:39:34.460729 master-0 kubenswrapper[13046]: I0308 03:39:34.460670 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"98a9871e-f04f-49db-a626-5ca80990f3c9","Type":"ContainerDied","Data":"b1398ccc0b85990bb6e4cc3b7eeae86db81d42894e271c3d612788106638afd4"} Mar 08 03:39:34.460790 master-0 kubenswrapper[13046]: I0308 03:39:34.460766 13046 scope.go:117] "RemoveContainer" containerID="e9301bafd173bd555c29d451b9f1a50677d884a8bad1eaff371f0025aa15404d" Mar 08 03:39:34.461046 master-0 kubenswrapper[13046]: I0308 03:39:34.461003 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 03:39:34.512896 master-0 kubenswrapper[13046]: I0308 03:39:34.512789 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98a9871e-f04f-49db-a626-5ca80990f3c9" (UID: "98a9871e-f04f-49db-a626-5ca80990f3c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:39:34.542284 master-0 kubenswrapper[13046]: I0308 03:39:34.540342 13046 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:34.542284 master-0 kubenswrapper[13046]: I0308 03:39:34.540380 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:34.542284 master-0 kubenswrapper[13046]: I0308 03:39:34.540393 13046 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98a9871e-f04f-49db-a626-5ca80990f3c9-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:34.542284 master-0 kubenswrapper[13046]: I0308 03:39:34.540402 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:34.542284 master-0 kubenswrapper[13046]: I0308 03:39:34.540412 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlzw7\" (UniqueName: \"kubernetes.io/projected/98a9871e-f04f-49db-a626-5ca80990f3c9-kube-api-access-nlzw7\") on node \"master-0\" DevicePath \"\"" Mar 08 
03:39:34.542284 master-0 kubenswrapper[13046]: I0308 03:39:34.540422 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98a9871e-f04f-49db-a626-5ca80990f3c9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:34.542284 master-0 kubenswrapper[13046]: I0308 03:39:34.540430 13046 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98a9871e-f04f-49db-a626-5ca80990f3c9-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:34.609159 master-0 kubenswrapper[13046]: I0308 03:39:34.609117 13046 scope.go:117] "RemoveContainer" containerID="c761151777221b6ff32d7ff4667d01dd8c58b3604eee6189ea8012e808a12c76" Mar 08 03:39:34.893498 master-0 kubenswrapper[13046]: I0308 03:39:34.888573 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:34.905134 master-0 kubenswrapper[13046]: I0308 03:39:34.904406 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:34.931523 master-0 kubenswrapper[13046]: I0308 03:39:34.918197 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:34.931523 master-0 kubenswrapper[13046]: E0308 03:39:34.918837 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a9871e-f04f-49db-a626-5ca80990f3c9" containerName="inspector-pxe-init" Mar 08 03:39:34.931523 master-0 kubenswrapper[13046]: I0308 03:39:34.918859 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a9871e-f04f-49db-a626-5ca80990f3c9" containerName="inspector-pxe-init" Mar 08 03:39:34.931523 master-0 kubenswrapper[13046]: E0308 03:39:34.918906 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a9871e-f04f-49db-a626-5ca80990f3c9" containerName="ironic-python-agent-init" Mar 08 03:39:34.931523 master-0 kubenswrapper[13046]: I0308 
03:39:34.918914 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a9871e-f04f-49db-a626-5ca80990f3c9" containerName="ironic-python-agent-init" Mar 08 03:39:34.931523 master-0 kubenswrapper[13046]: I0308 03:39:34.920286 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a9871e-f04f-49db-a626-5ca80990f3c9" containerName="inspector-pxe-init" Mar 08 03:39:34.942681 master-0 kubenswrapper[13046]: I0308 03:39:34.932612 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 03:39:34.942681 master-0 kubenswrapper[13046]: I0308 03:39:34.937600 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 08 03:39:34.942681 master-0 kubenswrapper[13046]: I0308 03:39:34.937852 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 08 03:39:34.942681 master-0 kubenswrapper[13046]: I0308 03:39:34.937877 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Mar 08 03:39:34.942681 master-0 kubenswrapper[13046]: I0308 03:39:34.938123 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 08 03:39:34.956415 master-0 kubenswrapper[13046]: I0308 03:39:34.955063 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Mar 08 03:39:34.977515 master-0 kubenswrapper[13046]: I0308 03:39:34.972063 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:34.989553 master-0 kubenswrapper[13046]: I0308 03:39:34.978047 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:39:35.058270 master-0 kubenswrapper[13046]: I0308 03:39:35.056236 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hz7t\" (UniqueName: \"kubernetes.io/projected/fbcbe677-7983-4b97-9146-604451f6b8d6-kube-api-access-6hz7t\") pod \"fbcbe677-7983-4b97-9146-604451f6b8d6\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " Mar 08 03:39:35.058270 master-0 kubenswrapper[13046]: I0308 03:39:35.056410 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-svc\") pod \"fbcbe677-7983-4b97-9146-604451f6b8d6\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " Mar 08 03:39:35.058270 master-0 kubenswrapper[13046]: I0308 03:39:35.056443 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-nb\") pod \"fbcbe677-7983-4b97-9146-604451f6b8d6\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.069514 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-config\") pod \"fbcbe677-7983-4b97-9146-604451f6b8d6\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.069713 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-sb\") pod \"fbcbe677-7983-4b97-9146-604451f6b8d6\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.069806 13046 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-swift-storage-0\") pod \"fbcbe677-7983-4b97-9146-604451f6b8d6\" (UID: \"fbcbe677-7983-4b97-9146-604451f6b8d6\") " Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.070343 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fa21680e-6931-4e05-afe2-bceebbb4389e-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.070599 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-config\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.070636 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2wl5\" (UniqueName: \"kubernetes.io/projected/fa21680e-6931-4e05-afe2-bceebbb4389e-kube-api-access-g2wl5\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.070669 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.070741 13046 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-scripts\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.070892 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.070942 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.071078 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fa21680e-6931-4e05-afe2-bceebbb4389e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.072717 master-0 kubenswrapper[13046]: I0308 03:39:35.071156 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fa21680e-6931-4e05-afe2-bceebbb4389e-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.114494 master-0 
kubenswrapper[13046]: I0308 03:39:35.111243 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcbe677-7983-4b97-9146-604451f6b8d6-kube-api-access-6hz7t" (OuterVolumeSpecName: "kube-api-access-6hz7t") pod "fbcbe677-7983-4b97-9146-604451f6b8d6" (UID: "fbcbe677-7983-4b97-9146-604451f6b8d6"). InnerVolumeSpecName "kube-api-access-6hz7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:39:35.130525 master-0 kubenswrapper[13046]: I0308 03:39:35.128562 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fbcbe677-7983-4b97-9146-604451f6b8d6" (UID: "fbcbe677-7983-4b97-9146-604451f6b8d6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:35.134946 master-0 kubenswrapper[13046]: I0308 03:39:35.133474 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fbcbe677-7983-4b97-9146-604451f6b8d6" (UID: "fbcbe677-7983-4b97-9146-604451f6b8d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:35.154794 master-0 kubenswrapper[13046]: I0308 03:39:35.154266 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fbcbe677-7983-4b97-9146-604451f6b8d6" (UID: "fbcbe677-7983-4b97-9146-604451f6b8d6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:35.173455 master-0 kubenswrapper[13046]: I0308 03:39:35.173397 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-config\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.173455 master-0 kubenswrapper[13046]: I0308 03:39:35.173450 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2wl5\" (UniqueName: \"kubernetes.io/projected/fa21680e-6931-4e05-afe2-bceebbb4389e-kube-api-access-g2wl5\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.173684 master-0 kubenswrapper[13046]: I0308 03:39:35.173507 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.173684 master-0 kubenswrapper[13046]: I0308 03:39:35.173547 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-scripts\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.174182 master-0 kubenswrapper[13046]: I0308 03:39:35.174158 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.174242 master-0 
kubenswrapper[13046]: I0308 03:39:35.174208 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.175631 master-0 kubenswrapper[13046]: I0308 03:39:35.175220 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fa21680e-6931-4e05-afe2-bceebbb4389e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.176613 master-0 kubenswrapper[13046]: I0308 03:39:35.176072 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fa21680e-6931-4e05-afe2-bceebbb4389e-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.176613 master-0 kubenswrapper[13046]: I0308 03:39:35.176102 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/fa21680e-6931-4e05-afe2-bceebbb4389e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.176613 master-0 kubenswrapper[13046]: I0308 03:39:35.176120 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fa21680e-6931-4e05-afe2-bceebbb4389e-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.176613 
master-0 kubenswrapper[13046]: I0308 03:39:35.176350 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hz7t\" (UniqueName: \"kubernetes.io/projected/fbcbe677-7983-4b97-9146-604451f6b8d6-kube-api-access-6hz7t\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:35.176613 master-0 kubenswrapper[13046]: I0308 03:39:35.176366 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:35.176613 master-0 kubenswrapper[13046]: I0308 03:39:35.176378 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:35.176613 master-0 kubenswrapper[13046]: I0308 03:39:35.176389 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:35.176613 master-0 kubenswrapper[13046]: I0308 03:39:35.176587 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/fa21680e-6931-4e05-afe2-bceebbb4389e-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.177187 master-0 kubenswrapper[13046]: I0308 03:39:35.177140 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-scripts\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.188775 master-0 kubenswrapper[13046]: I0308 03:39:35.188730 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.189095 master-0 kubenswrapper[13046]: I0308 03:39:35.189061 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.189975 master-0 kubenswrapper[13046]: I0308 03:39:35.189919 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fbcbe677-7983-4b97-9146-604451f6b8d6" (UID: "fbcbe677-7983-4b97-9146-604451f6b8d6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:35.190763 master-0 kubenswrapper[13046]: I0308 03:39:35.190722 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-config\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.191238 master-0 kubenswrapper[13046]: I0308 03:39:35.191205 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa21680e-6931-4e05-afe2-bceebbb4389e-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.191402 master-0 kubenswrapper[13046]: I0308 03:39:35.191370 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2wl5\" (UniqueName: \"kubernetes.io/projected/fa21680e-6931-4e05-afe2-bceebbb4389e-kube-api-access-g2wl5\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.192152 master-0 kubenswrapper[13046]: I0308 03:39:35.192120 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-config" (OuterVolumeSpecName: "config") pod "fbcbe677-7983-4b97-9146-604451f6b8d6" (UID: "fbcbe677-7983-4b97-9146-604451f6b8d6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:39:35.193195 master-0 kubenswrapper[13046]: I0308 03:39:35.193158 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/fa21680e-6931-4e05-afe2-bceebbb4389e-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"fa21680e-6931-4e05-afe2-bceebbb4389e\") " pod="openstack/ironic-inspector-0" Mar 08 03:39:35.289741 master-0 kubenswrapper[13046]: I0308 03:39:35.264586 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 08 03:39:35.293577 master-0 kubenswrapper[13046]: I0308 03:39:35.293518 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:35.293686 master-0 kubenswrapper[13046]: I0308 03:39:35.293670 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbcbe677-7983-4b97-9146-604451f6b8d6-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:39:35.492119 master-0 kubenswrapper[13046]: I0308 03:39:35.492071 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" event={"ID":"fbcbe677-7983-4b97-9146-604451f6b8d6","Type":"ContainerDied","Data":"dde5bfccb5356bf723103cc57a35fcf6cf372fd6dcaa3c73ba383e370431e0f2"} Mar 08 03:39:35.492119 master-0 kubenswrapper[13046]: I0308 03:39:35.492123 13046 scope.go:117] "RemoveContainer" containerID="3c0972bc923fef035427b96163d52c64dcc7330036f87132230479f40575b42c" Mar 08 03:39:35.492268 master-0 kubenswrapper[13046]: I0308 03:39:35.492224 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85fc44959f-w8nqq" Mar 08 03:39:35.559604 master-0 kubenswrapper[13046]: I0308 03:39:35.559415 13046 scope.go:117] "RemoveContainer" containerID="4bdb6aac0720cf11b371222ad04927a426b562a9802427436c2e798dd12f9f70" Mar 08 03:39:35.568716 master-0 kubenswrapper[13046]: I0308 03:39:35.567680 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85fc44959f-w8nqq"] Mar 08 03:39:35.593786 master-0 kubenswrapper[13046]: I0308 03:39:35.593575 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85fc44959f-w8nqq"] Mar 08 03:39:36.071226 master-0 kubenswrapper[13046]: I0308 03:39:36.071172 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 08 03:39:36.073415 master-0 kubenswrapper[13046]: W0308 03:39:36.073350 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa21680e_6931_4e05_afe2_bceebbb4389e.slice/crio-3a772b5648dbba7ae260dceea27a3c9e043a76ee0e46a7047b9c86328c9a8b02 WatchSource:0}: Error finding container 3a772b5648dbba7ae260dceea27a3c9e043a76ee0e46a7047b9c86328c9a8b02: Status 404 returned error can't find the container with id 3a772b5648dbba7ae260dceea27a3c9e043a76ee0e46a7047b9c86328c9a8b02 Mar 08 03:39:36.135573 master-0 kubenswrapper[13046]: I0308 03:39:36.135513 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98a9871e-f04f-49db-a626-5ca80990f3c9" path="/var/lib/kubelet/pods/98a9871e-f04f-49db-a626-5ca80990f3c9/volumes" Mar 08 03:39:36.136695 master-0 kubenswrapper[13046]: I0308 03:39:36.136628 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbcbe677-7983-4b97-9146-604451f6b8d6" path="/var/lib/kubelet/pods/fbcbe677-7983-4b97-9146-604451f6b8d6/volumes" Mar 08 03:39:36.506093 master-0 kubenswrapper[13046]: I0308 03:39:36.506033 13046 generic.go:334] "Generic (PLEG): container 
finished" podID="fa21680e-6931-4e05-afe2-bceebbb4389e" containerID="18597086fd0b996b1e64ca56d50a50bec3af7e8e5385d4473348d9eacc108c05" exitCode=0 Mar 08 03:39:36.506093 master-0 kubenswrapper[13046]: I0308 03:39:36.506094 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerDied","Data":"18597086fd0b996b1e64ca56d50a50bec3af7e8e5385d4473348d9eacc108c05"} Mar 08 03:39:36.506405 master-0 kubenswrapper[13046]: I0308 03:39:36.506120 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerStarted","Data":"3a772b5648dbba7ae260dceea27a3c9e043a76ee0e46a7047b9c86328c9a8b02"} Mar 08 03:39:37.533680 master-0 kubenswrapper[13046]: I0308 03:39:37.533633 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerStarted","Data":"0c4c7c5eb815ed702b096065c50f4afa1ec85339981a75f7b55395d819c7d1e1"} Mar 08 03:39:37.747257 master-0 kubenswrapper[13046]: I0308 03:39:37.746548 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:37.749413 master-0 kubenswrapper[13046]: I0308 03:39:37.748867 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:37.792220 master-0 kubenswrapper[13046]: I0308 03:39:37.791721 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:37.795550 master-0 kubenswrapper[13046]: I0308 03:39:37.795148 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-external-api-0" Mar 08 03:39:38.289731 master-0 kubenswrapper[13046]: I0308 03:39:38.287350 13046 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:38.289731 master-0 kubenswrapper[13046]: I0308 03:39:38.287412 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:38.322862 master-0 kubenswrapper[13046]: I0308 03:39:38.322701 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:38.347452 master-0 kubenswrapper[13046]: I0308 03:39:38.347388 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:38.555344 master-0 kubenswrapper[13046]: I0308 03:39:38.554371 13046 generic.go:334] "Generic (PLEG): container finished" podID="fa21680e-6931-4e05-afe2-bceebbb4389e" containerID="0c4c7c5eb815ed702b096065c50f4afa1ec85339981a75f7b55395d819c7d1e1" exitCode=0 Mar 08 03:39:38.555344 master-0 kubenswrapper[13046]: I0308 03:39:38.554460 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerDied","Data":"0c4c7c5eb815ed702b096065c50f4afa1ec85339981a75f7b55395d819c7d1e1"} Mar 08 03:39:38.555344 master-0 kubenswrapper[13046]: I0308 03:39:38.554559 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerStarted","Data":"ebf48fc58e0de38af0a785f6542d586f2b3d14123a9baeae9ee3077f716b9d09"} Mar 08 03:39:38.555344 master-0 kubenswrapper[13046]: I0308 03:39:38.555280 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-bf784-default-internal-api-0" Mar 08 03:39:38.555344 master-0 kubenswrapper[13046]: I0308 03:39:38.555323 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:39:38.555344 master-0 kubenswrapper[13046]: I0308 03:39:38.555334 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:39:38.555344 master-0 kubenswrapper[13046]: I0308 03:39:38.555344 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:39:39.576162 master-0 kubenswrapper[13046]: I0308 03:39:39.576093 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerStarted","Data":"50721a7f9665cad8824d76e1e0ecc7f992a3c444d5508af749837c7f4d5f332a"}
Mar 08 03:39:41.970065 master-0 kubenswrapper[13046]: I0308 03:39:41.970019 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:39:41.970704 master-0 kubenswrapper[13046]: I0308 03:39:41.970113 13046 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 03:39:41.982430 master-0 kubenswrapper[13046]: I0308 03:39:41.982385 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-internal-api-0"
Mar 08 03:39:41.984008 master-0 kubenswrapper[13046]: I0308 03:39:41.983916 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:39:41.984097 master-0 kubenswrapper[13046]: I0308 03:39:41.984040 13046 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 03:39:41.992116 master-0 kubenswrapper[13046]: I0308 03:39:41.992075 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-bf784-default-external-api-0"
Mar 08 03:39:45.800507 master-0 kubenswrapper[13046]: I0308 03:39:45.798657 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerStarted","Data":"d3576404426f597ef6dc8c61b55c3e204a055583848c5598039198151fdcda3f"}
Mar 08 03:39:45.805364 master-0 kubenswrapper[13046]: I0308 03:39:45.805296 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ckdg7" event={"ID":"510d6b1b-5eec-47ab-ba92-23ef35ec5f83","Type":"ContainerStarted","Data":"c881448e5ca03aa12111fdc11156330c6818f43b383ccc8c8aaed17fca718b22"}
Mar 08 03:39:45.967674 master-0 kubenswrapper[13046]: I0308 03:39:45.967617 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ckdg7" podStartSLOduration=2.844596547 podStartE2EDuration="13.967601029s" podCreationTimestamp="2026-03-08 03:39:32 +0000 UTC" firstStartedPulling="2026-03-08 03:39:33.68404719 +0000 UTC m=+1575.762814407" lastFinishedPulling="2026-03-08 03:39:44.807051672 +0000 UTC m=+1586.885818889" observedRunningTime="2026-03-08 03:39:45.962522915 +0000 UTC m=+1588.041290132" watchObservedRunningTime="2026-03-08 03:39:45.967601029 +0000 UTC m=+1588.046368236"
Mar 08 03:39:46.820973 master-0 kubenswrapper[13046]: I0308 03:39:46.820719 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerStarted","Data":"89538e7e137c2e940c7d97cd7286288695775165e945006653d85555414c1307"}
Mar 08 03:39:46.820973 master-0 kubenswrapper[13046]: I0308 03:39:46.820782 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"fa21680e-6931-4e05-afe2-bceebbb4389e","Type":"ContainerStarted","Data":"cff471827c0423e0af2a8087fb03ae665c08c70d630cf324791c4b4d383d1709"}
Mar 08 03:39:46.820973 master-0 kubenswrapper[13046]: I0308 03:39:46.820797 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 03:39:46.820973 master-0 kubenswrapper[13046]: I0308 03:39:46.820809 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 03:39:46.863708 master-0 kubenswrapper[13046]: I0308 03:39:46.863624 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=12.86360279 podStartE2EDuration="12.86360279s" podCreationTimestamp="2026-03-08 03:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:39:46.851526928 +0000 UTC m=+1588.930294175" watchObservedRunningTime="2026-03-08 03:39:46.86360279 +0000 UTC m=+1588.942370007"
Mar 08 03:39:50.265536 master-0 kubenswrapper[13046]: I0308 03:39:50.265463 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 03:39:50.265536 master-0 kubenswrapper[13046]: I0308 03:39:50.265532 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 08 03:39:50.268563 master-0 kubenswrapper[13046]: I0308 03:39:50.268540 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 03:39:50.303821 master-0 kubenswrapper[13046]: I0308 03:39:50.303775 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 03:39:55.265859 master-0 kubenswrapper[13046]: I0308 03:39:55.265751 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 08 03:39:55.266504 master-0 kubenswrapper[13046]: I0308 03:39:55.265850 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 08 03:39:55.296050 master-0 kubenswrapper[13046]: I0308 03:39:55.295946 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 08 03:39:55.299545 master-0 kubenswrapper[13046]: I0308 03:39:55.299500 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 08 03:39:55.958503 master-0 kubenswrapper[13046]: I0308 03:39:55.958445 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 03:39:55.963257 master-0 kubenswrapper[13046]: I0308 03:39:55.963195 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 08 03:40:01.015356 master-0 kubenswrapper[13046]: I0308 03:40:01.015284 13046 generic.go:334] "Generic (PLEG): container finished" podID="510d6b1b-5eec-47ab-ba92-23ef35ec5f83" containerID="c881448e5ca03aa12111fdc11156330c6818f43b383ccc8c8aaed17fca718b22" exitCode=0
Mar 08 03:40:01.015356 master-0 kubenswrapper[13046]: I0308 03:40:01.015343 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ckdg7" event={"ID":"510d6b1b-5eec-47ab-ba92-23ef35ec5f83","Type":"ContainerDied","Data":"c881448e5ca03aa12111fdc11156330c6818f43b383ccc8c8aaed17fca718b22"}
Mar 08 03:40:02.643016 master-0 kubenswrapper[13046]: I0308 03:40:02.642953 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ckdg7"
Mar 08 03:40:02.696624 master-0 kubenswrapper[13046]: I0308 03:40:02.696565 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-scripts\") pod \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") "
Mar 08 03:40:02.696841 master-0 kubenswrapper[13046]: I0308 03:40:02.696635 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv8mg\" (UniqueName: \"kubernetes.io/projected/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-kube-api-access-tv8mg\") pod \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") "
Mar 08 03:40:02.696841 master-0 kubenswrapper[13046]: I0308 03:40:02.696739 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-combined-ca-bundle\") pod \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") "
Mar 08 03:40:02.696841 master-0 kubenswrapper[13046]: I0308 03:40:02.696767 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-config-data\") pod \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\" (UID: \"510d6b1b-5eec-47ab-ba92-23ef35ec5f83\") "
Mar 08 03:40:02.702675 master-0 kubenswrapper[13046]: I0308 03:40:02.702369 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-scripts" (OuterVolumeSpecName: "scripts") pod "510d6b1b-5eec-47ab-ba92-23ef35ec5f83" (UID: "510d6b1b-5eec-47ab-ba92-23ef35ec5f83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:40:02.725754 master-0 kubenswrapper[13046]: I0308 03:40:02.725596 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-kube-api-access-tv8mg" (OuterVolumeSpecName: "kube-api-access-tv8mg") pod "510d6b1b-5eec-47ab-ba92-23ef35ec5f83" (UID: "510d6b1b-5eec-47ab-ba92-23ef35ec5f83"). InnerVolumeSpecName "kube-api-access-tv8mg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:40:02.728996 master-0 kubenswrapper[13046]: I0308 03:40:02.728323 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-config-data" (OuterVolumeSpecName: "config-data") pod "510d6b1b-5eec-47ab-ba92-23ef35ec5f83" (UID: "510d6b1b-5eec-47ab-ba92-23ef35ec5f83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:40:02.739779 master-0 kubenswrapper[13046]: I0308 03:40:02.739724 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "510d6b1b-5eec-47ab-ba92-23ef35ec5f83" (UID: "510d6b1b-5eec-47ab-ba92-23ef35ec5f83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:40:02.798835 master-0 kubenswrapper[13046]: I0308 03:40:02.798776 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:02.798835 master-0 kubenswrapper[13046]: I0308 03:40:02.798828 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:02.798835 master-0 kubenswrapper[13046]: I0308 03:40:02.798837 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-scripts\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:02.799005 master-0 kubenswrapper[13046]: I0308 03:40:02.798850 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv8mg\" (UniqueName: \"kubernetes.io/projected/510d6b1b-5eec-47ab-ba92-23ef35ec5f83-kube-api-access-tv8mg\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:03.047024 master-0 kubenswrapper[13046]: I0308 03:40:03.043461 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ckdg7" event={"ID":"510d6b1b-5eec-47ab-ba92-23ef35ec5f83","Type":"ContainerDied","Data":"cbc358aa5fb56cb9db5ffefe03f2b2b3dfcf759417d79cf3d492979b9ea26c8a"}
Mar 08 03:40:03.047024 master-0 kubenswrapper[13046]: I0308 03:40:03.043573 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbc358aa5fb56cb9db5ffefe03f2b2b3dfcf759417d79cf3d492979b9ea26c8a"
Mar 08 03:40:03.047024 master-0 kubenswrapper[13046]: I0308 03:40:03.043539 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ckdg7"
Mar 08 03:40:03.167163 master-0 kubenswrapper[13046]: I0308 03:40:03.167101 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 03:40:03.167793 master-0 kubenswrapper[13046]: E0308 03:40:03.167756 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerName="dnsmasq-dns"
Mar 08 03:40:03.167793 master-0 kubenswrapper[13046]: I0308 03:40:03.167780 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerName="dnsmasq-dns"
Mar 08 03:40:03.167943 master-0 kubenswrapper[13046]: E0308 03:40:03.167836 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerName="init"
Mar 08 03:40:03.167943 master-0 kubenswrapper[13046]: I0308 03:40:03.167846 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerName="init"
Mar 08 03:40:03.167943 master-0 kubenswrapper[13046]: E0308 03:40:03.167862 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="510d6b1b-5eec-47ab-ba92-23ef35ec5f83" containerName="nova-cell0-conductor-db-sync"
Mar 08 03:40:03.167943 master-0 kubenswrapper[13046]: I0308 03:40:03.167872 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="510d6b1b-5eec-47ab-ba92-23ef35ec5f83" containerName="nova-cell0-conductor-db-sync"
Mar 08 03:40:03.168181 master-0 kubenswrapper[13046]: I0308 03:40:03.168151 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcbe677-7983-4b97-9146-604451f6b8d6" containerName="dnsmasq-dns"
Mar 08 03:40:03.168328 master-0 kubenswrapper[13046]: I0308 03:40:03.168231 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="510d6b1b-5eec-47ab-ba92-23ef35ec5f83" containerName="nova-cell0-conductor-db-sync"
Mar 08 03:40:03.169214 master-0 kubenswrapper[13046]: I0308 03:40:03.169165 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.173420 master-0 kubenswrapper[13046]: I0308 03:40:03.173361 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 08 03:40:03.183886 master-0 kubenswrapper[13046]: I0308 03:40:03.183814 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 03:40:03.209745 master-0 kubenswrapper[13046]: I0308 03:40:03.208341 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16893d4-11ce-49ea-b9a9-be326e7887de-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.209745 master-0 kubenswrapper[13046]: I0308 03:40:03.208585 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16893d4-11ce-49ea-b9a9-be326e7887de-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.209745 master-0 kubenswrapper[13046]: I0308 03:40:03.208958 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vzb\" (UniqueName: \"kubernetes.io/projected/c16893d4-11ce-49ea-b9a9-be326e7887de-kube-api-access-95vzb\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.310262 master-0 kubenswrapper[13046]: I0308 03:40:03.310131 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vzb\" (UniqueName: \"kubernetes.io/projected/c16893d4-11ce-49ea-b9a9-be326e7887de-kube-api-access-95vzb\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.310262 master-0 kubenswrapper[13046]: I0308 03:40:03.310263 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16893d4-11ce-49ea-b9a9-be326e7887de-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.310517 master-0 kubenswrapper[13046]: I0308 03:40:03.310312 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16893d4-11ce-49ea-b9a9-be326e7887de-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.314551 master-0 kubenswrapper[13046]: I0308 03:40:03.314444 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16893d4-11ce-49ea-b9a9-be326e7887de-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.326296 master-0 kubenswrapper[13046]: I0308 03:40:03.326224 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16893d4-11ce-49ea-b9a9-be326e7887de-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.331542 master-0 kubenswrapper[13046]: I0308 03:40:03.331471 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vzb\" (UniqueName: \"kubernetes.io/projected/c16893d4-11ce-49ea-b9a9-be326e7887de-kube-api-access-95vzb\") pod \"nova-cell0-conductor-0\" (UID: \"c16893d4-11ce-49ea-b9a9-be326e7887de\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:03.490444 master-0 kubenswrapper[13046]: I0308 03:40:03.490358 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:04.087752 master-0 kubenswrapper[13046]: W0308 03:40:04.086159 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc16893d4_11ce_49ea_b9a9_be326e7887de.slice/crio-5692ae2c85ac96df7b43fa7c90e299eaf6349a81503e1e563d5cf63108d07bc8 WatchSource:0}: Error finding container 5692ae2c85ac96df7b43fa7c90e299eaf6349a81503e1e563d5cf63108d07bc8: Status 404 returned error can't find the container with id 5692ae2c85ac96df7b43fa7c90e299eaf6349a81503e1e563d5cf63108d07bc8
Mar 08 03:40:04.092494 master-0 kubenswrapper[13046]: I0308 03:40:04.089521 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 03:40:05.097664 master-0 kubenswrapper[13046]: I0308 03:40:05.097574 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c16893d4-11ce-49ea-b9a9-be326e7887de","Type":"ContainerStarted","Data":"5f20548e26c3c02533c791eb685266053bddc83def464e9fd26dc8c495b860bb"}
Mar 08 03:40:05.097664 master-0 kubenswrapper[13046]: I0308 03:40:05.097635 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c16893d4-11ce-49ea-b9a9-be326e7887de","Type":"ContainerStarted","Data":"5692ae2c85ac96df7b43fa7c90e299eaf6349a81503e1e563d5cf63108d07bc8"}
Mar 08 03:40:05.097664 master-0 kubenswrapper[13046]: I0308 03:40:05.097677 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:05.118121 master-0 kubenswrapper[13046]: I0308 03:40:05.118027 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.118007738 podStartE2EDuration="2.118007738s" podCreationTimestamp="2026-03-08 03:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:05.117942416 +0000 UTC m=+1607.196709643" watchObservedRunningTime="2026-03-08 03:40:05.118007738 +0000 UTC m=+1607.196774955"
Mar 08 03:40:13.530721 master-0 kubenswrapper[13046]: I0308 03:40:13.530678 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 08 03:40:14.164593 master-0 kubenswrapper[13046]: I0308 03:40:14.164533 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hrdjm"]
Mar 08 03:40:14.165899 master-0 kubenswrapper[13046]: I0308 03:40:14.165862 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hrdjm"]
Mar 08 03:40:14.166114 master-0 kubenswrapper[13046]: I0308 03:40:14.166053 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.172090 master-0 kubenswrapper[13046]: I0308 03:40:14.172031 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 08 03:40:14.172315 master-0 kubenswrapper[13046]: I0308 03:40:14.172183 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 08 03:40:14.295511 master-0 kubenswrapper[13046]: I0308 03:40:14.294584 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 08 03:40:14.316510 master-0 kubenswrapper[13046]: I0308 03:40:14.309660 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.316510 master-0 kubenswrapper[13046]: I0308 03:40:14.311985 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-scripts\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.316510 master-0 kubenswrapper[13046]: I0308 03:40:14.312097 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-config-data\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.316510 master-0 kubenswrapper[13046]: I0308 03:40:14.312171 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.316510 master-0 kubenswrapper[13046]: I0308 03:40:14.312205 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mqc\" (UniqueName: \"kubernetes.io/projected/b3bd0477-6b38-4d58-b53b-34c4c323496b-kube-api-access-78mqc\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.327501 master-0 kubenswrapper[13046]: I0308 03:40:14.322196 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data"
Mar 08 03:40:14.367377 master-0 kubenswrapper[13046]: I0308 03:40:14.362530 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.417637 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63cbc85a-14cf-42d6-90a3-a6f4199557f9-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.417769 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63cbc85a-14cf-42d6-90a3-a6f4199557f9-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.417876 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbgh\" (UniqueName: \"kubernetes.io/projected/63cbc85a-14cf-42d6-90a3-a6f4199557f9-kube-api-access-6gbgh\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.417915 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-scripts\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.418030 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-config-data\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.418879 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.419225 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78mqc\" (UniqueName: \"kubernetes.io/projected/b3bd0477-6b38-4d58-b53b-34c4c323496b-kube-api-access-78mqc\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.422808 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.431112 master-0 kubenswrapper[13046]: I0308 03:40:14.423171 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-scripts\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.433210 master-0 kubenswrapper[13046]: I0308 03:40:14.433167 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-config-data\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.433286 master-0 kubenswrapper[13046]: I0308 03:40:14.433243 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 08 03:40:14.435170 master-0 kubenswrapper[13046]: I0308 03:40:14.435145 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 03:40:14.439408 master-0 kubenswrapper[13046]: I0308 03:40:14.437504 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 08 03:40:14.451382 master-0 kubenswrapper[13046]: I0308 03:40:14.448332 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mqc\" (UniqueName: \"kubernetes.io/projected/b3bd0477-6b38-4d58-b53b-34c4c323496b-kube-api-access-78mqc\") pod \"nova-cell0-cell-mapping-hrdjm\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.464073 master-0 kubenswrapper[13046]: I0308 03:40:14.451914 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 03:40:14.464073 master-0 kubenswrapper[13046]: I0308 03:40:14.462573 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 03:40:14.464577 master-0 kubenswrapper[13046]: I0308 03:40:14.464522 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 08 03:40:14.469660 master-0 kubenswrapper[13046]: I0308 03:40:14.468401 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 08 03:40:14.500403 master-0 kubenswrapper[13046]: I0308 03:40:14.500330 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 03:40:14.501065 master-0 kubenswrapper[13046]: I0308 03:40:14.500931 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hrdjm"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.523678 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.523719 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5tr7\" (UniqueName: \"kubernetes.io/projected/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-kube-api-access-s5tr7\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.523784 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63cbc85a-14cf-42d6-90a3-a6f4199557f9-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.523844 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63cbc85a-14cf-42d6-90a3-a6f4199557f9-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.523915 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbgh\" (UniqueName: \"kubernetes.io/projected/63cbc85a-14cf-42d6-90a3-a6f4199557f9-kube-api-access-6gbgh\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.523972 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-config-data\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.524004 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-logs\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.527508 master-0 kubenswrapper[13046]: I0308 03:40:14.527004 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63cbc85a-14cf-42d6-90a3-a6f4199557f9-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.531504 master-0 kubenswrapper[13046]: I0308 03:40:14.530291 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63cbc85a-14cf-42d6-90a3-a6f4199557f9-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.585464 master-0 kubenswrapper[13046]: I0308 03:40:14.585423 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbgh\" (UniqueName: \"kubernetes.io/projected/63cbc85a-14cf-42d6-90a3-a6f4199557f9-kube-api-access-6gbgh\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"63cbc85a-14cf-42d6-90a3-a6f4199557f9\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 08 03:40:14.628826 master-0 kubenswrapper[13046]: I0308 03:40:14.628340 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0"
Mar 08 03:40:14.628826 master-0 kubenswrapper[13046]: I0308 03:40:14.628413 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-config-data\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0"
Mar 08 03:40:14.628826 master-0 kubenswrapper[13046]: I0308 03:40:14.628463 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-config-data\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.628826 master-0 kubenswrapper[13046]: I0308 03:40:14.628499 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-logs\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.628826 master-0 kubenswrapper[13046]: I0308 03:40:14.628584 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k7rw\" (UniqueName: \"kubernetes.io/projected/1c41abde-9569-460b-9461-69eabb5eb006-kube-api-access-8k7rw\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0"
Mar 08 03:40:14.628826 master-0 kubenswrapper[13046]: I0308 03:40:14.628611 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.628826 master-0 kubenswrapper[13046]: I0308 03:40:14.628628 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5tr7\" (UniqueName: \"kubernetes.io/projected/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-kube-api-access-s5tr7\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.641562 master-0 kubenswrapper[13046]: I0308 03:40:14.639598 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.644595 master-0 kubenswrapper[13046]: I0308 03:40:14.644556 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-logs\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.647105 master-0 kubenswrapper[13046]: I0308 03:40:14.645603 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-config-data\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0"
Mar 08 03:40:14.661438 master-0 kubenswrapper[13046]: I0308 03:40:14.656304 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:14.669764 master-0 kubenswrapper[13046]: I0308 03:40:14.665265 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:40:14.674360 master-0 kubenswrapper[13046]: I0308 03:40:14.673831 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 08 03:40:14.686818 master-0 kubenswrapper[13046]: I0308 03:40:14.678506 13046 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 03:40:14.686818 master-0 kubenswrapper[13046]: I0308 03:40:14.682900 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5tr7\" (UniqueName: \"kubernetes.io/projected/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-kube-api-access-s5tr7\") pod \"nova-api-0\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " pod="openstack/nova-api-0" Mar 08 03:40:14.686818 master-0 kubenswrapper[13046]: I0308 03:40:14.683074 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:40:14.712193 master-0 kubenswrapper[13046]: I0308 03:40:14.706173 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:14.712193 master-0 kubenswrapper[13046]: I0308 03:40:14.707672 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:14.712193 master-0 kubenswrapper[13046]: I0308 03:40:14.707745 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:14.740803 master-0 kubenswrapper[13046]: I0308 03:40:14.739537 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 03:40:14.760700 master-0 kubenswrapper[13046]: I0308 03:40:14.751957 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k7rw\" (UniqueName: \"kubernetes.io/projected/1c41abde-9569-460b-9461-69eabb5eb006-kube-api-access-8k7rw\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:14.760700 master-0 kubenswrapper[13046]: I0308 03:40:14.752312 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:14.760700 master-0 kubenswrapper[13046]: I0308 03:40:14.752344 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-config-data\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:14.788512 master-0 kubenswrapper[13046]: I0308 03:40:14.785734 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:40:14.789725 master-0 kubenswrapper[13046]: I0308 03:40:14.789414 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-config-data\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:14.793577 master-0 kubenswrapper[13046]: I0308 03:40:14.792828 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:14.858729 master-0 kubenswrapper[13046]: I0308 03:40:14.857877 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-config-data\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.858729 master-0 kubenswrapper[13046]: I0308 03:40:14.857940 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:14.858729 master-0 kubenswrapper[13046]: I0308 03:40:14.857974 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tv4t\" (UniqueName: \"kubernetes.io/projected/425a364f-d62f-47cb-8203-c4f94d3f5ee1-kube-api-access-7tv4t\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " 
pod="openstack/nova-metadata-0" Mar 08 03:40:14.858729 master-0 kubenswrapper[13046]: I0308 03:40:14.857989 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:14.858729 master-0 kubenswrapper[13046]: I0308 03:40:14.858011 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.858729 master-0 kubenswrapper[13046]: I0308 03:40:14.858028 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425a364f-d62f-47cb-8203-c4f94d3f5ee1-logs\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.858729 master-0 kubenswrapper[13046]: I0308 03:40:14.858192 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55lbh\" (UniqueName: \"kubernetes.io/projected/223afba7-1a3a-4d5e-b512-ecafdccc5dab-kube-api-access-55lbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:14.873247 master-0 kubenswrapper[13046]: I0308 03:40:14.864096 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k7rw\" (UniqueName: \"kubernetes.io/projected/1c41abde-9569-460b-9461-69eabb5eb006-kube-api-access-8k7rw\") pod \"nova-scheduler-0\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " 
pod="openstack/nova-scheduler-0" Mar 08 03:40:14.875023 master-0 kubenswrapper[13046]: I0308 03:40:14.874497 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:40:14.969063 master-0 kubenswrapper[13046]: I0308 03:40:14.968601 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-config-data\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.969063 master-0 kubenswrapper[13046]: I0308 03:40:14.968659 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:14.969063 master-0 kubenswrapper[13046]: I0308 03:40:14.968692 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tv4t\" (UniqueName: \"kubernetes.io/projected/425a364f-d62f-47cb-8203-c4f94d3f5ee1-kube-api-access-7tv4t\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.969063 master-0 kubenswrapper[13046]: I0308 03:40:14.968714 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:14.969063 master-0 kubenswrapper[13046]: I0308 03:40:14.968733 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.969063 master-0 kubenswrapper[13046]: I0308 03:40:14.968753 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425a364f-d62f-47cb-8203-c4f94d3f5ee1-logs\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.969063 master-0 kubenswrapper[13046]: I0308 03:40:14.968938 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55lbh\" (UniqueName: \"kubernetes.io/projected/223afba7-1a3a-4d5e-b512-ecafdccc5dab-kube-api-access-55lbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:14.979664 master-0 kubenswrapper[13046]: I0308 03:40:14.972312 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425a364f-d62f-47cb-8203-c4f94d3f5ee1-logs\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:14.996245 master-0 kubenswrapper[13046]: I0308 03:40:14.995903 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:15.010636 master-0 kubenswrapper[13046]: I0308 03:40:15.008323 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55lbh\" (UniqueName: \"kubernetes.io/projected/223afba7-1a3a-4d5e-b512-ecafdccc5dab-kube-api-access-55lbh\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:15.023040 master-0 kubenswrapper[13046]: I0308 03:40:15.017081 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tv4t\" (UniqueName: \"kubernetes.io/projected/425a364f-d62f-47cb-8203-c4f94d3f5ee1-kube-api-access-7tv4t\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:15.059813 master-0 kubenswrapper[13046]: I0308 03:40:15.050272 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-config-data\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:15.074536 master-0 kubenswrapper[13046]: I0308 03:40:15.062150 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57cbb5d5bf-qr52m"] Mar 08 03:40:15.074536 master-0 kubenswrapper[13046]: I0308 03:40:15.065159 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:15.074536 master-0 kubenswrapper[13046]: I0308 03:40:15.065232 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") " pod="openstack/nova-metadata-0" Mar 08 03:40:15.077145 master-0 kubenswrapper[13046]: I0308 03:40:15.075559 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.095077 master-0 kubenswrapper[13046]: I0308 03:40:15.092793 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57cbb5d5bf-qr52m"] Mar 08 03:40:15.196538 master-0 kubenswrapper[13046]: I0308 03:40:15.194444 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-sb\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.196538 master-0 kubenswrapper[13046]: I0308 03:40:15.194678 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-svc\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.196538 master-0 kubenswrapper[13046]: I0308 03:40:15.194908 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-swift-storage-0\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.196538 master-0 kubenswrapper[13046]: I0308 03:40:15.194934 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-nb\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.196538 master-0 kubenswrapper[13046]: I0308 
03:40:15.194998 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-config\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.196538 master-0 kubenswrapper[13046]: I0308 03:40:15.195060 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4pqg\" (UniqueName: \"kubernetes.io/projected/15ec3f9b-6617-4580-8758-0bfd755ff867-kube-api-access-f4pqg\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.276271 master-0 kubenswrapper[13046]: I0308 03:40:15.276175 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 03:40:15.309297 master-0 kubenswrapper[13046]: I0308 03:40:15.309225 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-svc\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.310294 master-0 kubenswrapper[13046]: I0308 03:40:15.310222 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-swift-storage-0\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.310450 master-0 kubenswrapper[13046]: I0308 03:40:15.310395 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-nb\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.310584 master-0 kubenswrapper[13046]: I0308 03:40:15.310550 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-svc\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.310740 master-0 kubenswrapper[13046]: I0308 03:40:15.310687 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-config\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.310897 master-0 kubenswrapper[13046]: I0308 03:40:15.310855 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pqg\" (UniqueName: \"kubernetes.io/projected/15ec3f9b-6617-4580-8758-0bfd755ff867-kube-api-access-f4pqg\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.314657 master-0 kubenswrapper[13046]: I0308 03:40:15.311120 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-swift-storage-0\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.314657 master-0 kubenswrapper[13046]: I0308 03:40:15.311718 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-nb\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.315540 master-0 kubenswrapper[13046]: I0308 03:40:15.314901 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-sb\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.319144 master-0 kubenswrapper[13046]: I0308 03:40:15.315897 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-sb\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.319144 master-0 kubenswrapper[13046]: I0308 03:40:15.316408 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-config\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.333755 master-0 kubenswrapper[13046]: I0308 03:40:15.333713 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pqg\" (UniqueName: \"kubernetes.io/projected/15ec3f9b-6617-4580-8758-0bfd755ff867-kube-api-access-f4pqg\") pod \"dnsmasq-dns-57cbb5d5bf-qr52m\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.333755 master-0 kubenswrapper[13046]: I0308 03:40:15.333732 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:15.475728 master-0 kubenswrapper[13046]: I0308 03:40:15.472850 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:40:15.486619 master-0 kubenswrapper[13046]: W0308 03:40:15.485867 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3bd0477_6b38_4d58_b53b_34c4c323496b.slice/crio-d13631dfde1fe174f4ed87e4b4982d5ba6e97484ce7c1945e1938c2b4357fa1b WatchSource:0}: Error finding container d13631dfde1fe174f4ed87e4b4982d5ba6e97484ce7c1945e1938c2b4357fa1b: Status 404 returned error can't find the container with id d13631dfde1fe174f4ed87e4b4982d5ba6e97484ce7c1945e1938c2b4357fa1b Mar 08 03:40:15.545202 master-0 kubenswrapper[13046]: I0308 03:40:15.545127 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hrdjm"] Mar 08 03:40:15.579912 master-0 kubenswrapper[13046]: W0308 03:40:15.579768 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63cbc85a_14cf_42d6_90a3_a6f4199557f9.slice/crio-2e607ff95a16b5a0ee631f043a02eb867213501ce191641b893513dd9469137a WatchSource:0}: Error finding container 2e607ff95a16b5a0ee631f043a02eb867213501ce191641b893513dd9469137a: Status 404 returned error can't find the container with id 2e607ff95a16b5a0ee631f043a02eb867213501ce191641b893513dd9469137a Mar 08 03:40:15.599835 master-0 kubenswrapper[13046]: I0308 03:40:15.599748 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 08 03:40:15.798898 master-0 kubenswrapper[13046]: I0308 03:40:15.798696 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:40:15.919737 master-0 kubenswrapper[13046]: I0308 03:40:15.918458 13046 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:40:16.021319 master-0 kubenswrapper[13046]: I0308 03:40:16.013642 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmrk"] Mar 08 03:40:16.021319 master-0 kubenswrapper[13046]: I0308 03:40:16.015266 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:16.033760 master-0 kubenswrapper[13046]: I0308 03:40:16.027851 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 03:40:16.033760 master-0 kubenswrapper[13046]: I0308 03:40:16.028048 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 03:40:16.045856 master-0 kubenswrapper[13046]: I0308 03:40:16.044256 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmrk"] Mar 08 03:40:16.052327 master-0 kubenswrapper[13046]: I0308 03:40:16.048745 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-scripts\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:16.052327 master-0 kubenswrapper[13046]: I0308 03:40:16.049160 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c87b\" (UniqueName: \"kubernetes.io/projected/854672de-d415-46d9-810b-5f7d085f1969-kube-api-access-5c87b\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:16.052327 master-0 kubenswrapper[13046]: I0308 03:40:16.049447 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:16.052327 master-0 kubenswrapper[13046]: I0308 03:40:16.049514 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-config-data\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:16.110585 master-0 kubenswrapper[13046]: I0308 03:40:16.110133 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:16.151325 master-0 kubenswrapper[13046]: I0308 03:40:16.151280 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-scripts\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:16.151460 master-0 kubenswrapper[13046]: I0308 03:40:16.151440 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c87b\" (UniqueName: \"kubernetes.io/projected/854672de-d415-46d9-810b-5f7d085f1969-kube-api-access-5c87b\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:16.151515 master-0 kubenswrapper[13046]: I0308 03:40:16.151489 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk"
Mar 08 03:40:16.151515 master-0 kubenswrapper[13046]: I0308 03:40:16.151513 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-config-data\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk"
Mar 08 03:40:16.158582 master-0 kubenswrapper[13046]: I0308 03:40:16.156847 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-scripts\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk"
Mar 08 03:40:16.159067 master-0 kubenswrapper[13046]: I0308 03:40:16.159034 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-config-data\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk"
Mar 08 03:40:16.161706 master-0 kubenswrapper[13046]: I0308 03:40:16.161662 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk"
Mar 08 03:40:16.167720 master-0 kubenswrapper[13046]: I0308 03:40:16.167682 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c87b\" (UniqueName: \"kubernetes.io/projected/854672de-d415-46d9-810b-5f7d085f1969-kube-api-access-5c87b\") pod \"nova-cell1-conductor-db-sync-lnmrk\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " pod="openstack/nova-cell1-conductor-db-sync-lnmrk"
Mar 08 03:40:16.188340 master-0 kubenswrapper[13046]: I0308 03:40:16.188287 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:16.303565 master-0 kubenswrapper[13046]: I0308 03:40:16.301925 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hrdjm" event={"ID":"b3bd0477-6b38-4d58-b53b-34c4c323496b","Type":"ContainerStarted","Data":"77a9e41b555abba36489ce3052653b67e27e7fb12a9ba0583a2d3384a02f51f1"}
Mar 08 03:40:16.303565 master-0 kubenswrapper[13046]: I0308 03:40:16.301991 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hrdjm" event={"ID":"b3bd0477-6b38-4d58-b53b-34c4c323496b","Type":"ContainerStarted","Data":"d13631dfde1fe174f4ed87e4b4982d5ba6e97484ce7c1945e1938c2b4357fa1b"}
Mar 08 03:40:16.307806 master-0 kubenswrapper[13046]: I0308 03:40:16.307782 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c41abde-9569-460b-9461-69eabb5eb006","Type":"ContainerStarted","Data":"611d5789c67e314393f9393a7c114b42be7c67c97a03086b40b819e8dcd516a1"}
Mar 08 03:40:16.308790 master-0 kubenswrapper[13046]: I0308 03:40:16.308771 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"63cbc85a-14cf-42d6-90a3-a6f4199557f9","Type":"ContainerStarted","Data":"2e607ff95a16b5a0ee631f043a02eb867213501ce191641b893513dd9469137a"}
Mar 08 03:40:16.310625 master-0 kubenswrapper[13046]: I0308 03:40:16.310580 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6","Type":"ContainerStarted","Data":"c2c5dd1e6918cc6ca9ce732faeef9dccaae8b6516eba2fa1b01c3e5d4fde44ee"}
Mar 08 03:40:16.323148 master-0 kubenswrapper[13046]: I0308 03:40:16.323100 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"223afba7-1a3a-4d5e-b512-ecafdccc5dab","Type":"ContainerStarted","Data":"5c3461b6cc6c5c6c779123a8afe3f1cf9a2f8eb16a8ade241c5f79326bf4f373"}
Mar 08 03:40:16.336259 master-0 kubenswrapper[13046]: I0308 03:40:16.336184 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hrdjm" podStartSLOduration=2.336163936 podStartE2EDuration="2.336163936s" podCreationTimestamp="2026-03-08 03:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:16.322059556 +0000 UTC m=+1618.400826773" watchObservedRunningTime="2026-03-08 03:40:16.336163936 +0000 UTC m=+1618.414931153"
Mar 08 03:40:16.337802 master-0 kubenswrapper[13046]: I0308 03:40:16.337764 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"425a364f-d62f-47cb-8203-c4f94d3f5ee1","Type":"ContainerStarted","Data":"53ecc5ccb822cd3dfb07fc80eb1c9193f788786e6a1ad09131506c2948e4415a"}
Mar 08 03:40:16.356361 master-0 kubenswrapper[13046]: I0308 03:40:16.356293 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57cbb5d5bf-qr52m"]
Mar 08 03:40:16.381819 master-0 kubenswrapper[13046]: I0308 03:40:16.381768 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmrk"
Mar 08 03:40:16.940364 master-0 kubenswrapper[13046]: I0308 03:40:16.939533 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmrk"]
Mar 08 03:40:17.360621 master-0 kubenswrapper[13046]: I0308 03:40:17.360568 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" event={"ID":"854672de-d415-46d9-810b-5f7d085f1969","Type":"ContainerStarted","Data":"8bedf1e50c5add1110f8c79fef570aa9c2d855d1823c1c3b5a7186f415883c04"}
Mar 08 03:40:17.360621 master-0 kubenswrapper[13046]: I0308 03:40:17.360626 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" event={"ID":"854672de-d415-46d9-810b-5f7d085f1969","Type":"ContainerStarted","Data":"9733682302f4d750b08b91d5414c3d3f749294935979d32e1113bde31ee86933"}
Mar 08 03:40:17.366882 master-0 kubenswrapper[13046]: I0308 03:40:17.366689 13046 generic.go:334] "Generic (PLEG): container finished" podID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerID="b89608461ea12d4cd9e07bebc2bf6d9eb64ab32e8bc3527426d8a8cddbdda400" exitCode=0
Mar 08 03:40:17.367602 master-0 kubenswrapper[13046]: I0308 03:40:17.366812 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" event={"ID":"15ec3f9b-6617-4580-8758-0bfd755ff867","Type":"ContainerDied","Data":"b89608461ea12d4cd9e07bebc2bf6d9eb64ab32e8bc3527426d8a8cddbdda400"}
Mar 08 03:40:17.371181 master-0 kubenswrapper[13046]: I0308 03:40:17.370672 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" event={"ID":"15ec3f9b-6617-4580-8758-0bfd755ff867","Type":"ContainerStarted","Data":"90042b7fffafbb65106289721a85333ba5f7cea3fb2ef8fddf82f026e87dd2f2"}
Mar 08 03:40:17.410505 master-0 kubenswrapper[13046]: I0308 03:40:17.404783 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" podStartSLOduration=2.404759197 podStartE2EDuration="2.404759197s" podCreationTimestamp="2026-03-08 03:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:17.385004438 +0000 UTC m=+1619.463771655" watchObservedRunningTime="2026-03-08 03:40:17.404759197 +0000 UTC m=+1619.483526434"
Mar 08 03:40:18.506508 master-0 kubenswrapper[13046]: I0308 03:40:18.501584 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 03:40:18.525460 master-0 kubenswrapper[13046]: I0308 03:40:18.525408 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:21.452797 master-0 kubenswrapper[13046]: I0308 03:40:21.452733 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"223afba7-1a3a-4d5e-b512-ecafdccc5dab","Type":"ContainerStarted","Data":"944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725"}
Mar 08 03:40:21.453670 master-0 kubenswrapper[13046]: I0308 03:40:21.452935 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="223afba7-1a3a-4d5e-b512-ecafdccc5dab" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725" gracePeriod=30
Mar 08 03:40:21.456854 master-0 kubenswrapper[13046]: I0308 03:40:21.456640 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"425a364f-d62f-47cb-8203-c4f94d3f5ee1","Type":"ContainerStarted","Data":"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"}
Mar 08 03:40:21.456854 master-0 kubenswrapper[13046]: I0308 03:40:21.456734 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"425a364f-d62f-47cb-8203-c4f94d3f5ee1","Type":"ContainerStarted","Data":"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"}
Mar 08 03:40:21.465995 master-0 kubenswrapper[13046]: I0308 03:40:21.463743 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-log" containerID="cri-o://1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7" gracePeriod=30
Mar 08 03:40:21.465995 master-0 kubenswrapper[13046]: I0308 03:40:21.463857 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-metadata" containerID="cri-o://5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56" gracePeriod=30
Mar 08 03:40:21.492013 master-0 kubenswrapper[13046]: I0308 03:40:21.491889 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c41abde-9569-460b-9461-69eabb5eb006","Type":"ContainerStarted","Data":"41f768ca2eaf5193c192195022780bad5bc651063531cb75afa60ba1a45cf69d"}
Mar 08 03:40:21.507771 master-0 kubenswrapper[13046]: I0308 03:40:21.507702 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" event={"ID":"15ec3f9b-6617-4580-8758-0bfd755ff867","Type":"ContainerStarted","Data":"09646a1f2a1f1f151e481f09da6d16e51e3e13038d73293586d40132f11078b6"}
Mar 08 03:40:21.508683 master-0 kubenswrapper[13046]: I0308 03:40:21.508399 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m"
Mar 08 03:40:21.513538 master-0 kubenswrapper[13046]: I0308 03:40:21.511610 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6","Type":"ContainerStarted","Data":"8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda"}
Mar 08 03:40:21.513538 master-0 kubenswrapper[13046]: I0308 03:40:21.511642 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6","Type":"ContainerStarted","Data":"49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc"}
Mar 08 03:40:21.541521 master-0 kubenswrapper[13046]: I0308 03:40:21.540439 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.326383955 podStartE2EDuration="7.540419142s" podCreationTimestamp="2026-03-08 03:40:14 +0000 UTC" firstStartedPulling="2026-03-08 03:40:16.111585451 +0000 UTC m=+1618.190352668" lastFinishedPulling="2026-03-08 03:40:20.325620638 +0000 UTC m=+1622.404387855" observedRunningTime="2026-03-08 03:40:21.532900479 +0000 UTC m=+1623.611667696" watchObservedRunningTime="2026-03-08 03:40:21.540419142 +0000 UTC m=+1623.619186359"
Mar 08 03:40:21.615712 master-0 kubenswrapper[13046]: I0308 03:40:21.613400 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.396002209 podStartE2EDuration="7.61338221s" podCreationTimestamp="2026-03-08 03:40:14 +0000 UTC" firstStartedPulling="2026-03-08 03:40:16.14011314 +0000 UTC m=+1618.218880357" lastFinishedPulling="2026-03-08 03:40:20.357493141 +0000 UTC m=+1622.436260358" observedRunningTime="2026-03-08 03:40:21.612626778 +0000 UTC m=+1623.691394015" watchObservedRunningTime="2026-03-08 03:40:21.61338221 +0000 UTC m=+1623.692149427"
Mar 08 03:40:21.648192 master-0 kubenswrapper[13046]: I0308 03:40:21.645774 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.1284842680000002 podStartE2EDuration="7.645754047s" podCreationTimestamp="2026-03-08 03:40:14 +0000 UTC" firstStartedPulling="2026-03-08 03:40:15.801585667 +0000 UTC m=+1617.880352884" lastFinishedPulling="2026-03-08 03:40:20.318855446 +0000 UTC m=+1622.397622663" observedRunningTime="2026-03-08 03:40:21.63597377 +0000 UTC m=+1623.714740987" watchObservedRunningTime="2026-03-08 03:40:21.645754047 +0000 UTC m=+1623.724521254"
Mar 08 03:40:21.675356 master-0 kubenswrapper[13046]: I0308 03:40:21.675240 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" podStartSLOduration=7.675214102 podStartE2EDuration="7.675214102s" podCreationTimestamp="2026-03-08 03:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:21.666660219 +0000 UTC m=+1623.745427446" watchObservedRunningTime="2026-03-08 03:40:21.675214102 +0000 UTC m=+1623.753981319"
Mar 08 03:40:21.707085 master-0 kubenswrapper[13046]: I0308 03:40:21.706887 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.2894573400000002 podStartE2EDuration="7.706861139s" podCreationTimestamp="2026-03-08 03:40:14 +0000 UTC" firstStartedPulling="2026-03-08 03:40:15.936668565 +0000 UTC m=+1618.015435782" lastFinishedPulling="2026-03-08 03:40:20.354072364 +0000 UTC m=+1622.432839581" observedRunningTime="2026-03-08 03:40:21.689091865 +0000 UTC m=+1623.767859092" watchObservedRunningTime="2026-03-08 03:40:21.706861139 +0000 UTC m=+1623.785628356"
Mar 08 03:40:22.213228 master-0 kubenswrapper[13046]: I0308 03:40:22.213173 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:40:22.329713 master-0 kubenswrapper[13046]: I0308 03:40:22.329558 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-config-data\") pod \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") "
Mar 08 03:40:22.329713 master-0 kubenswrapper[13046]: I0308 03:40:22.329637 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tv4t\" (UniqueName: \"kubernetes.io/projected/425a364f-d62f-47cb-8203-c4f94d3f5ee1-kube-api-access-7tv4t\") pod \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") "
Mar 08 03:40:22.330078 master-0 kubenswrapper[13046]: I0308 03:40:22.329753 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425a364f-d62f-47cb-8203-c4f94d3f5ee1-logs\") pod \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") "
Mar 08 03:40:22.330078 master-0 kubenswrapper[13046]: I0308 03:40:22.329868 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-combined-ca-bundle\") pod \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\" (UID: \"425a364f-d62f-47cb-8203-c4f94d3f5ee1\") "
Mar 08 03:40:22.330667 master-0 kubenswrapper[13046]: I0308 03:40:22.330633 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425a364f-d62f-47cb-8203-c4f94d3f5ee1-logs" (OuterVolumeSpecName: "logs") pod "425a364f-d62f-47cb-8203-c4f94d3f5ee1" (UID: "425a364f-d62f-47cb-8203-c4f94d3f5ee1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:40:22.332268 master-0 kubenswrapper[13046]: I0308 03:40:22.332222 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425a364f-d62f-47cb-8203-c4f94d3f5ee1-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:22.341741 master-0 kubenswrapper[13046]: I0308 03:40:22.337442 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425a364f-d62f-47cb-8203-c4f94d3f5ee1-kube-api-access-7tv4t" (OuterVolumeSpecName: "kube-api-access-7tv4t") pod "425a364f-d62f-47cb-8203-c4f94d3f5ee1" (UID: "425a364f-d62f-47cb-8203-c4f94d3f5ee1"). InnerVolumeSpecName "kube-api-access-7tv4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:40:22.356746 master-0 kubenswrapper[13046]: I0308 03:40:22.356694 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-config-data" (OuterVolumeSpecName: "config-data") pod "425a364f-d62f-47cb-8203-c4f94d3f5ee1" (UID: "425a364f-d62f-47cb-8203-c4f94d3f5ee1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:40:22.374257 master-0 kubenswrapper[13046]: I0308 03:40:22.374209 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425a364f-d62f-47cb-8203-c4f94d3f5ee1" (UID: "425a364f-d62f-47cb-8203-c4f94d3f5ee1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:40:22.434778 master-0 kubenswrapper[13046]: I0308 03:40:22.434729 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:22.434778 master-0 kubenswrapper[13046]: I0308 03:40:22.434768 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tv4t\" (UniqueName: \"kubernetes.io/projected/425a364f-d62f-47cb-8203-c4f94d3f5ee1-kube-api-access-7tv4t\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:22.434778 master-0 kubenswrapper[13046]: I0308 03:40:22.434779 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425a364f-d62f-47cb-8203-c4f94d3f5ee1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:40:22.530183 master-0 kubenswrapper[13046]: I0308 03:40:22.530129 13046 generic.go:334] "Generic (PLEG): container finished" podID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerID="5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56" exitCode=0
Mar 08 03:40:22.530183 master-0 kubenswrapper[13046]: I0308 03:40:22.530165 13046 generic.go:334] "Generic (PLEG): container finished" podID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerID="1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7" exitCode=143
Mar 08 03:40:22.530183 master-0 kubenswrapper[13046]: I0308 03:40:22.530178 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:40:22.530775 master-0 kubenswrapper[13046]: I0308 03:40:22.530256 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"425a364f-d62f-47cb-8203-c4f94d3f5ee1","Type":"ContainerDied","Data":"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"}
Mar 08 03:40:22.530775 master-0 kubenswrapper[13046]: I0308 03:40:22.530283 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"425a364f-d62f-47cb-8203-c4f94d3f5ee1","Type":"ContainerDied","Data":"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"}
Mar 08 03:40:22.530775 master-0 kubenswrapper[13046]: I0308 03:40:22.530293 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"425a364f-d62f-47cb-8203-c4f94d3f5ee1","Type":"ContainerDied","Data":"53ecc5ccb822cd3dfb07fc80eb1c9193f788786e6a1ad09131506c2948e4415a"}
Mar 08 03:40:22.530775 master-0 kubenswrapper[13046]: I0308 03:40:22.530312 13046 scope.go:117] "RemoveContainer" containerID="5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"
Mar 08 03:40:22.548514 master-0 kubenswrapper[13046]: E0308 03:40:22.548462 13046 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/nova-metadata-0_openstack_nova-metadata-metadata-5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56.log: no such file or directory" path="/var/log/containers/nova-metadata-0_openstack_nova-metadata-metadata-5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56.log"
Mar 08 03:40:22.561045 master-0 kubenswrapper[13046]: I0308 03:40:22.561008 13046 scope.go:117] "RemoveContainer" containerID="1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"
Mar 08 03:40:22.577987 master-0 kubenswrapper[13046]: I0308 03:40:22.577920 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:22.590877 master-0 kubenswrapper[13046]: I0308 03:40:22.590092 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:22.612048 master-0 kubenswrapper[13046]: I0308 03:40:22.611930 13046 scope.go:117] "RemoveContainer" containerID="5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"
Mar 08 03:40:22.617692 master-0 kubenswrapper[13046]: I0308 03:40:22.617327 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:22.620470 master-0 kubenswrapper[13046]: E0308 03:40:22.620223 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56\": container with ID starting with 5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56 not found: ID does not exist" containerID="5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"
Mar 08 03:40:22.620470 master-0 kubenswrapper[13046]: I0308 03:40:22.620278 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"} err="failed to get container status \"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56\": rpc error: code = NotFound desc = could not find container \"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56\": container with ID starting with 5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56 not found: ID does not exist"
Mar 08 03:40:22.620470 master-0 kubenswrapper[13046]: I0308 03:40:22.620305 13046 scope.go:117] "RemoveContainer" containerID="1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: E0308 03:40:22.621805 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7\": container with ID starting with 1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7 not found: ID does not exist" containerID="1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.621830 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"} err="failed to get container status \"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7\": rpc error: code = NotFound desc = could not find container \"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7\": container with ID starting with 1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7 not found: ID does not exist"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.621848 13046 scope.go:117] "RemoveContainer" containerID="5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: E0308 03:40:22.623168 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-log"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.623208 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-log"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: E0308 03:40:22.623227 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-metadata"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.623233 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-metadata"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.623695 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-log"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.623708 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" containerName="nova-metadata-metadata"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.624593 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56"} err="failed to get container status \"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56\": rpc error: code = NotFound desc = could not find container \"5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56\": container with ID starting with 5aac4e877051350a54564b491afece83f9143a82ff67340733e8385b597b9c56 not found: ID does not exist"
Mar 08 03:40:22.629360 master-0 kubenswrapper[13046]: I0308 03:40:22.624622 13046 scope.go:117] "RemoveContainer" containerID="1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"
Mar 08 03:40:22.643216 master-0 kubenswrapper[13046]: I0308 03:40:22.642963 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7"} err="failed to get container status \"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7\": rpc error: code = NotFound desc = could not find container \"1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7\": container with ID starting with 1d7129d4fcc844040d51fa3b17f1d9b12a8751648f671cfad92694584afa91b7 not found: ID does not exist"
Mar 08 03:40:22.650817 master-0 kubenswrapper[13046]: I0308 03:40:22.650416 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:22.650817 master-0 kubenswrapper[13046]: I0308 03:40:22.650544 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:40:22.661066 master-0 kubenswrapper[13046]: I0308 03:40:22.660240 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 08 03:40:22.661066 master-0 kubenswrapper[13046]: I0308 03:40:22.660880 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 08 03:40:22.744403 master-0 kubenswrapper[13046]: I0308 03:40:22.744359 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.744700 master-0 kubenswrapper[13046]: I0308 03:40:22.744681 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.744828 master-0 kubenswrapper[13046]: I0308 03:40:22.744813 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-config-data\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.744911 master-0 kubenswrapper[13046]: I0308 03:40:22.744899 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27bed797-75f3-4acf-9349-27eb09c2a7f6-logs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.744996 master-0 kubenswrapper[13046]: I0308 03:40:22.744983 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7fs\" (UniqueName: \"kubernetes.io/projected/27bed797-75f3-4acf-9349-27eb09c2a7f6-kube-api-access-dk7fs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.847277 master-0 kubenswrapper[13046]: I0308 03:40:22.847174 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-config-data\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.847505 master-0 kubenswrapper[13046]: I0308 03:40:22.847477 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27bed797-75f3-4acf-9349-27eb09c2a7f6-logs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.847633 master-0 kubenswrapper[13046]: I0308 03:40:22.847620 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7fs\" (UniqueName: \"kubernetes.io/projected/27bed797-75f3-4acf-9349-27eb09c2a7f6-kube-api-access-dk7fs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.847805 master-0 kubenswrapper[13046]: I0308 03:40:22.847791 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.847947 master-0 kubenswrapper[13046]: I0308 03:40:22.847932 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.851713 master-0 kubenswrapper[13046]: I0308 03:40:22.851696 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.854903 master-0 kubenswrapper[13046]: I0308 03:40:22.854888 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-config-data\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.855302 master-0 kubenswrapper[13046]: I0308 03:40:22.855288 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27bed797-75f3-4acf-9349-27eb09c2a7f6-logs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.859376 master-0 kubenswrapper[13046]: I0308 03:40:22.859328 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.885353 master-0 kubenswrapper[13046]: I0308 03:40:22.885315 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7fs\" (UniqueName: \"kubernetes.io/projected/27bed797-75f3-4acf-9349-27eb09c2a7f6-kube-api-access-dk7fs\") pod \"nova-metadata-0\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " pod="openstack/nova-metadata-0"
Mar 08 03:40:22.982264 master-0 kubenswrapper[13046]: I0308 03:40:22.982207 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:40:23.422269 master-0 kubenswrapper[13046]: I0308 03:40:23.422203 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:40:23.493275 master-0 kubenswrapper[13046]: W0308 03:40:23.493226 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27bed797_75f3_4acf_9349_27eb09c2a7f6.slice/crio-d773ebf3f298020972d5d7b606ac52e412e2f62233034c557c1a542c1ec3a6e0 WatchSource:0}: Error finding container d773ebf3f298020972d5d7b606ac52e412e2f62233034c557c1a542c1ec3a6e0: Status 404 returned error can't find the container with id d773ebf3f298020972d5d7b606ac52e412e2f62233034c557c1a542c1ec3a6e0
Mar 08 03:40:23.546008 master-0 kubenswrapper[13046]: I0308 03:40:23.545944 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27bed797-75f3-4acf-9349-27eb09c2a7f6","Type":"ContainerStarted","Data":"d773ebf3f298020972d5d7b606ac52e412e2f62233034c557c1a542c1ec3a6e0"}
Mar 08 03:40:23.551347 master-0 kubenswrapper[13046]: I0308 03:40:23.550322 13046 generic.go:334] "Generic (PLEG): container finished" podID="b3bd0477-6b38-4d58-b53b-34c4c323496b" containerID="77a9e41b555abba36489ce3052653b67e27e7fb12a9ba0583a2d3384a02f51f1" exitCode=0
Mar 08 03:40:23.551347 master-0 kubenswrapper[13046]: I0308 03:40:23.550362 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hrdjm" event={"ID":"b3bd0477-6b38-4d58-b53b-34c4c323496b","Type":"ContainerDied","Data":"77a9e41b555abba36489ce3052653b67e27e7fb12a9ba0583a2d3384a02f51f1"}
Mar 08 03:40:24.146315 master-0 kubenswrapper[13046]: I0308 03:40:24.146255 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425a364f-d62f-47cb-8203-c4f94d3f5ee1" path="/var/lib/kubelet/pods/425a364f-d62f-47cb-8203-c4f94d3f5ee1/volumes"
Mar 08 03:40:24.567960 master-0 kubenswrapper[13046]: I0308 03:40:24.567763 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27bed797-75f3-4acf-9349-27eb09c2a7f6","Type":"ContainerStarted","Data":"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c"}
Mar 08 03:40:24.788071 master-0 kubenswrapper[13046]: I0308 03:40:24.787425 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 03:40:24.788071 master-0 kubenswrapper[13046]: I0308 03:40:24.787552 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 03:40:24.884324 master-0 kubenswrapper[13046]: I0308 03:40:24.884266 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 08 03:40:24.885810 master-0 kubenswrapper[13046]: I0308 03:40:24.885781 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 08 03:40:24.928600 master-0 kubenswrapper[13046]: I0308 03:40:24.928516 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 08 03:40:25.334714 master-0 kubenswrapper[13046]: I0308 03:40:25.334587 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 03:40:25.478400 master-0 kubenswrapper[13046]: I0308 03:40:25.478329 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m"
Mar 08 03:40:25.698502 master-0 kubenswrapper[13046]: I0308 03:40:25.695510 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fd8d55b9-c4jcr"]
Mar 08 03:40:25.698502 master-0 kubenswrapper[13046]: I0308 03:40:25.695798 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" podUID="20c587ce-777a-463e-877b-ada22b560d4c" containerName="dnsmasq-dns" containerID="cri-o://9ae6a1821890e3ce840e0fc548870c8c23bf2e129aee66ef2fa0e5b43bb3524e" gracePeriod=10
Mar 08 03:40:25.713173 master-0 kubenswrapper[13046]: I0308 03:40:25.711787 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 08 03:40:25.829089 master-0 kubenswrapper[13046]: I0308 03:40:25.828874 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.1:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:40:25.874246 master-0 kubenswrapper[13046]: I0308 03:40:25.874131 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.1:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:40:29.006034 master-0 kubenswrapper[13046]: I0308 03:40:29.005924 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" podUID="20c587ce-777a-463e-877b-ada22b560d4c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.250:5353: connect: connection refused"
Mar 08 03:40:30.419948 master-0 kubenswrapper[13046]: I0308 03:40:30.419888 13046 util.go:48] "No
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hrdjm" Mar 08 03:40:30.603118 master-0 kubenswrapper[13046]: I0308 03:40:30.603074 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-combined-ca-bundle\") pod \"b3bd0477-6b38-4d58-b53b-34c4c323496b\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " Mar 08 03:40:30.603373 master-0 kubenswrapper[13046]: I0308 03:40:30.603221 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78mqc\" (UniqueName: \"kubernetes.io/projected/b3bd0477-6b38-4d58-b53b-34c4c323496b-kube-api-access-78mqc\") pod \"b3bd0477-6b38-4d58-b53b-34c4c323496b\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " Mar 08 03:40:30.603440 master-0 kubenswrapper[13046]: I0308 03:40:30.603351 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-config-data\") pod \"b3bd0477-6b38-4d58-b53b-34c4c323496b\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " Mar 08 03:40:30.603498 master-0 kubenswrapper[13046]: I0308 03:40:30.603469 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-scripts\") pod \"b3bd0477-6b38-4d58-b53b-34c4c323496b\" (UID: \"b3bd0477-6b38-4d58-b53b-34c4c323496b\") " Mar 08 03:40:30.607732 master-0 kubenswrapper[13046]: I0308 03:40:30.607668 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3bd0477-6b38-4d58-b53b-34c4c323496b-kube-api-access-78mqc" (OuterVolumeSpecName: "kube-api-access-78mqc") pod "b3bd0477-6b38-4d58-b53b-34c4c323496b" (UID: "b3bd0477-6b38-4d58-b53b-34c4c323496b"). InnerVolumeSpecName "kube-api-access-78mqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:40:30.610726 master-0 kubenswrapper[13046]: I0308 03:40:30.609441 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-scripts" (OuterVolumeSpecName: "scripts") pod "b3bd0477-6b38-4d58-b53b-34c4c323496b" (UID: "b3bd0477-6b38-4d58-b53b-34c4c323496b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:30.646047 master-0 kubenswrapper[13046]: I0308 03:40:30.645988 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-config-data" (OuterVolumeSpecName: "config-data") pod "b3bd0477-6b38-4d58-b53b-34c4c323496b" (UID: "b3bd0477-6b38-4d58-b53b-34c4c323496b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:30.657020 master-0 kubenswrapper[13046]: I0308 03:40:30.656972 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3bd0477-6b38-4d58-b53b-34c4c323496b" (UID: "b3bd0477-6b38-4d58-b53b-34c4c323496b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:30.685214 master-0 kubenswrapper[13046]: I0308 03:40:30.685162 13046 generic.go:334] "Generic (PLEG): container finished" podID="854672de-d415-46d9-810b-5f7d085f1969" containerID="8bedf1e50c5add1110f8c79fef570aa9c2d855d1823c1c3b5a7186f415883c04" exitCode=0 Mar 08 03:40:30.685290 master-0 kubenswrapper[13046]: I0308 03:40:30.685265 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" event={"ID":"854672de-d415-46d9-810b-5f7d085f1969","Type":"ContainerDied","Data":"8bedf1e50c5add1110f8c79fef570aa9c2d855d1823c1c3b5a7186f415883c04"} Mar 08 03:40:30.690276 master-0 kubenswrapper[13046]: I0308 03:40:30.690241 13046 generic.go:334] "Generic (PLEG): container finished" podID="20c587ce-777a-463e-877b-ada22b560d4c" containerID="9ae6a1821890e3ce840e0fc548870c8c23bf2e129aee66ef2fa0e5b43bb3524e" exitCode=0 Mar 08 03:40:30.690339 master-0 kubenswrapper[13046]: I0308 03:40:30.690321 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" event={"ID":"20c587ce-777a-463e-877b-ada22b560d4c","Type":"ContainerDied","Data":"9ae6a1821890e3ce840e0fc548870c8c23bf2e129aee66ef2fa0e5b43bb3524e"} Mar 08 03:40:30.690442 master-0 kubenswrapper[13046]: I0308 03:40:30.690418 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:40:30.692374 master-0 kubenswrapper[13046]: I0308 03:40:30.692339 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hrdjm" Mar 08 03:40:30.694197 master-0 kubenswrapper[13046]: I0308 03:40:30.694046 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hrdjm" event={"ID":"b3bd0477-6b38-4d58-b53b-34c4c323496b","Type":"ContainerDied","Data":"d13631dfde1fe174f4ed87e4b4982d5ba6e97484ce7c1945e1938c2b4357fa1b"} Mar 08 03:40:30.694301 master-0 kubenswrapper[13046]: I0308 03:40:30.694255 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13631dfde1fe174f4ed87e4b4982d5ba6e97484ce7c1945e1938c2b4357fa1b" Mar 08 03:40:30.696423 master-0 kubenswrapper[13046]: I0308 03:40:30.696376 13046 generic.go:334] "Generic (PLEG): container finished" podID="528b1064-a3b2-4ea4-8584-abeffdbedbbe" containerID="a3f39704bbe83d757e07514378179093996ce1eec717a8549e5e52edc164a630" exitCode=0 Mar 08 03:40:30.696501 master-0 kubenswrapper[13046]: I0308 03:40:30.696428 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerDied","Data":"a3f39704bbe83d757e07514378179093996ce1eec717a8549e5e52edc164a630"} Mar 08 03:40:30.713434 master-0 kubenswrapper[13046]: I0308 03:40:30.713362 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.713434 master-0 kubenswrapper[13046]: I0308 03:40:30.713419 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.713434 master-0 kubenswrapper[13046]: I0308 03:40:30.713430 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3bd0477-6b38-4d58-b53b-34c4c323496b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.713809 master-0 kubenswrapper[13046]: I0308 03:40:30.713443 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78mqc\" (UniqueName: \"kubernetes.io/projected/b3bd0477-6b38-4d58-b53b-34c4c323496b-kube-api-access-78mqc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.821274 master-0 kubenswrapper[13046]: I0308 03:40:30.819829 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-nb\") pod \"20c587ce-777a-463e-877b-ada22b560d4c\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " Mar 08 03:40:30.821274 master-0 kubenswrapper[13046]: I0308 03:40:30.819957 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-config\") pod \"20c587ce-777a-463e-877b-ada22b560d4c\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " Mar 08 03:40:30.821274 master-0 kubenswrapper[13046]: I0308 03:40:30.820751 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-swift-storage-0\") pod \"20c587ce-777a-463e-877b-ada22b560d4c\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " Mar 08 03:40:30.821274 master-0 kubenswrapper[13046]: I0308 03:40:30.820797 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-sb\") pod \"20c587ce-777a-463e-877b-ada22b560d4c\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " Mar 08 03:40:30.821274 master-0 kubenswrapper[13046]: I0308 03:40:30.820966 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-svc\") pod \"20c587ce-777a-463e-877b-ada22b560d4c\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " Mar 08 03:40:30.821274 master-0 kubenswrapper[13046]: I0308 03:40:30.821028 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngcvt\" (UniqueName: \"kubernetes.io/projected/20c587ce-777a-463e-877b-ada22b560d4c-kube-api-access-ngcvt\") pod \"20c587ce-777a-463e-877b-ada22b560d4c\" (UID: \"20c587ce-777a-463e-877b-ada22b560d4c\") " Mar 08 03:40:30.839028 master-0 kubenswrapper[13046]: I0308 03:40:30.837172 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c587ce-777a-463e-877b-ada22b560d4c-kube-api-access-ngcvt" (OuterVolumeSpecName: "kube-api-access-ngcvt") pod "20c587ce-777a-463e-877b-ada22b560d4c" (UID: "20c587ce-777a-463e-877b-ada22b560d4c"). InnerVolumeSpecName "kube-api-access-ngcvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:40:30.910601 master-0 kubenswrapper[13046]: I0308 03:40:30.906019 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "20c587ce-777a-463e-877b-ada22b560d4c" (UID: "20c587ce-777a-463e-877b-ada22b560d4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:40:30.912829 master-0 kubenswrapper[13046]: I0308 03:40:30.912765 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-config" (OuterVolumeSpecName: "config") pod "20c587ce-777a-463e-877b-ada22b560d4c" (UID: "20c587ce-777a-463e-877b-ada22b560d4c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:40:30.921338 master-0 kubenswrapper[13046]: I0308 03:40:30.921283 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "20c587ce-777a-463e-877b-ada22b560d4c" (UID: "20c587ce-777a-463e-877b-ada22b560d4c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:40:30.924079 master-0 kubenswrapper[13046]: I0308 03:40:30.924038 13046 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.924079 master-0 kubenswrapper[13046]: I0308 03:40:30.924074 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngcvt\" (UniqueName: \"kubernetes.io/projected/20c587ce-777a-463e-877b-ada22b560d4c-kube-api-access-ngcvt\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.924181 master-0 kubenswrapper[13046]: I0308 03:40:30.924086 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.924181 master-0 kubenswrapper[13046]: I0308 03:40:30.924095 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:30.940457 master-0 kubenswrapper[13046]: I0308 03:40:30.938017 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "20c587ce-777a-463e-877b-ada22b560d4c" (UID: 
"20c587ce-777a-463e-877b-ada22b560d4c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:40:30.940457 master-0 kubenswrapper[13046]: I0308 03:40:30.938182 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "20c587ce-777a-463e-877b-ada22b560d4c" (UID: "20c587ce-777a-463e-877b-ada22b560d4c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:40:31.026468 master-0 kubenswrapper[13046]: I0308 03:40:31.026399 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:31.026468 master-0 kubenswrapper[13046]: I0308 03:40:31.026440 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20c587ce-777a-463e-877b-ada22b560d4c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:31.635932 master-0 kubenswrapper[13046]: I0308 03:40:31.635876 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:40:31.637717 master-0 kubenswrapper[13046]: I0308 03:40:31.637681 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-log" containerID="cri-o://49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc" gracePeriod=30 Mar 08 03:40:31.638232 master-0 kubenswrapper[13046]: I0308 03:40:31.638121 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-api" 
containerID="cri-o://8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda" gracePeriod=30 Mar 08 03:40:31.708707 master-0 kubenswrapper[13046]: I0308 03:40:31.707745 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:40:31.708707 master-0 kubenswrapper[13046]: I0308 03:40:31.707988 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1c41abde-9569-460b-9461-69eabb5eb006" containerName="nova-scheduler-scheduler" containerID="cri-o://41f768ca2eaf5193c192195022780bad5bc651063531cb75afa60ba1a45cf69d" gracePeriod=30 Mar 08 03:40:31.722648 master-0 kubenswrapper[13046]: I0308 03:40:31.722153 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:40:31.731158 master-0 kubenswrapper[13046]: I0308 03:40:31.731034 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27bed797-75f3-4acf-9349-27eb09c2a7f6","Type":"ContainerStarted","Data":"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1"} Mar 08 03:40:31.735302 master-0 kubenswrapper[13046]: I0308 03:40:31.735275 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"63cbc85a-14cf-42d6-90a3-a6f4199557f9","Type":"ContainerStarted","Data":"ada2da8e67dffb9adaed82dafa9149c1636056338ba243b56e398586885e0e95"} Mar 08 03:40:31.735885 master-0 kubenswrapper[13046]: I0308 03:40:31.735834 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 03:40:31.748200 master-0 kubenswrapper[13046]: I0308 03:40:31.748151 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" event={"ID":"20c587ce-777a-463e-877b-ada22b560d4c","Type":"ContainerDied","Data":"b641a36155cb3df4ffedac4ba8a2c997d5cd5a94be70e8d37cb741f925591a10"} Mar 08 
03:40:31.748384 master-0 kubenswrapper[13046]: I0308 03:40:31.748211 13046 scope.go:117] "RemoveContainer" containerID="9ae6a1821890e3ce840e0fc548870c8c23bf2e129aee66ef2fa0e5b43bb3524e" Mar 08 03:40:31.748384 master-0 kubenswrapper[13046]: I0308 03:40:31.748371 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fd8d55b9-c4jcr" Mar 08 03:40:31.792802 master-0 kubenswrapper[13046]: I0308 03:40:31.786291 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.786260327 podStartE2EDuration="9.786260327s" podCreationTimestamp="2026-03-08 03:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:31.756691019 +0000 UTC m=+1633.835458246" watchObservedRunningTime="2026-03-08 03:40:31.786260327 +0000 UTC m=+1633.865027544" Mar 08 03:40:31.792802 master-0 kubenswrapper[13046]: I0308 03:40:31.787770 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerStarted","Data":"d4f0e370e7f7933e85ec7461540efb2f028b0e8ca3907911cf98f995c20be351"} Mar 08 03:40:31.804516 master-0 kubenswrapper[13046]: I0308 03:40:31.803022 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 08 03:40:31.816548 master-0 kubenswrapper[13046]: I0308 03:40:31.814287 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=2.975931835 podStartE2EDuration="17.814266941s" podCreationTimestamp="2026-03-08 03:40:14 +0000 UTC" firstStartedPulling="2026-03-08 03:40:15.582991772 +0000 UTC m=+1617.661758989" lastFinishedPulling="2026-03-08 03:40:30.421326878 +0000 UTC m=+1632.500094095" observedRunningTime="2026-03-08 
03:40:31.77897116 +0000 UTC m=+1633.857738378" watchObservedRunningTime="2026-03-08 03:40:31.814266941 +0000 UTC m=+1633.893034158" Mar 08 03:40:31.822954 master-0 kubenswrapper[13046]: I0308 03:40:31.819823 13046 scope.go:117] "RemoveContainer" containerID="4739e085fc55b85aa30b9a7652d9fa14ea69654f8e9802bb852abe2d1f2b0955" Mar 08 03:40:31.879616 master-0 kubenswrapper[13046]: I0308 03:40:31.876659 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fd8d55b9-c4jcr"] Mar 08 03:40:31.901893 master-0 kubenswrapper[13046]: I0308 03:40:31.897506 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fd8d55b9-c4jcr"] Mar 08 03:40:32.152629 master-0 kubenswrapper[13046]: I0308 03:40:32.151674 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:32.154416 master-0 kubenswrapper[13046]: I0308 03:40:32.154375 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c587ce-777a-463e-877b-ada22b560d4c" path="/var/lib/kubelet/pods/20c587ce-777a-463e-877b-ada22b560d4c/volumes" Mar 08 03:40:32.274502 master-0 kubenswrapper[13046]: I0308 03:40:32.267841 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-config-data\") pod \"854672de-d415-46d9-810b-5f7d085f1969\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " Mar 08 03:40:32.274502 master-0 kubenswrapper[13046]: I0308 03:40:32.267940 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-scripts\") pod \"854672de-d415-46d9-810b-5f7d085f1969\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " Mar 08 03:40:32.274502 master-0 kubenswrapper[13046]: I0308 03:40:32.268014 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-combined-ca-bundle\") pod \"854672de-d415-46d9-810b-5f7d085f1969\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " Mar 08 03:40:32.274502 master-0 kubenswrapper[13046]: I0308 03:40:32.268590 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c87b\" (UniqueName: \"kubernetes.io/projected/854672de-d415-46d9-810b-5f7d085f1969-kube-api-access-5c87b\") pod \"854672de-d415-46d9-810b-5f7d085f1969\" (UID: \"854672de-d415-46d9-810b-5f7d085f1969\") " Mar 08 03:40:32.274502 master-0 kubenswrapper[13046]: I0308 03:40:32.270933 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-scripts" (OuterVolumeSpecName: "scripts") pod "854672de-d415-46d9-810b-5f7d085f1969" (UID: "854672de-d415-46d9-810b-5f7d085f1969"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:32.274502 master-0 kubenswrapper[13046]: I0308 03:40:32.272850 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854672de-d415-46d9-810b-5f7d085f1969-kube-api-access-5c87b" (OuterVolumeSpecName: "kube-api-access-5c87b") pod "854672de-d415-46d9-810b-5f7d085f1969" (UID: "854672de-d415-46d9-810b-5f7d085f1969"). InnerVolumeSpecName "kube-api-access-5c87b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:40:32.298539 master-0 kubenswrapper[13046]: I0308 03:40:32.296032 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "854672de-d415-46d9-810b-5f7d085f1969" (UID: "854672de-d415-46d9-810b-5f7d085f1969"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:32.298539 master-0 kubenswrapper[13046]: I0308 03:40:32.298102 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-config-data" (OuterVolumeSpecName: "config-data") pod "854672de-d415-46d9-810b-5f7d085f1969" (UID: "854672de-d415-46d9-810b-5f7d085f1969"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:32.372508 master-0 kubenswrapper[13046]: I0308 03:40:32.372436 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c87b\" (UniqueName: \"kubernetes.io/projected/854672de-d415-46d9-810b-5f7d085f1969-kube-api-access-5c87b\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:32.372508 master-0 kubenswrapper[13046]: I0308 03:40:32.372477 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:32.372508 master-0 kubenswrapper[13046]: I0308 03:40:32.372498 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:32.372508 master-0 kubenswrapper[13046]: I0308 03:40:32.372508 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854672de-d415-46d9-810b-5f7d085f1969-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:32.804237 master-0 kubenswrapper[13046]: I0308 03:40:32.803592 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" event={"ID":"854672de-d415-46d9-810b-5f7d085f1969","Type":"ContainerDied","Data":"9733682302f4d750b08b91d5414c3d3f749294935979d32e1113bde31ee86933"} Mar 08 03:40:32.804237 master-0 
kubenswrapper[13046]: I0308 03:40:32.803636 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9733682302f4d750b08b91d5414c3d3f749294935979d32e1113bde31ee86933" Mar 08 03:40:32.804237 master-0 kubenswrapper[13046]: I0308 03:40:32.803728 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lnmrk" Mar 08 03:40:32.807791 master-0 kubenswrapper[13046]: I0308 03:40:32.807711 13046 generic.go:334] "Generic (PLEG): container finished" podID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerID="49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc" exitCode=143 Mar 08 03:40:32.808046 master-0 kubenswrapper[13046]: I0308 03:40:32.807798 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6","Type":"ContainerDied","Data":"49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc"} Mar 08 03:40:32.821095 master-0 kubenswrapper[13046]: I0308 03:40:32.820973 13046 generic.go:334] "Generic (PLEG): container finished" podID="1c41abde-9569-460b-9461-69eabb5eb006" containerID="41f768ca2eaf5193c192195022780bad5bc651063531cb75afa60ba1a45cf69d" exitCode=0 Mar 08 03:40:32.821095 master-0 kubenswrapper[13046]: I0308 03:40:32.821035 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c41abde-9569-460b-9461-69eabb5eb006","Type":"ContainerDied","Data":"41f768ca2eaf5193c192195022780bad5bc651063531cb75afa60ba1a45cf69d"} Mar 08 03:40:32.827085 master-0 kubenswrapper[13046]: I0308 03:40:32.827036 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: E0308 03:40:32.827664 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854672de-d415-46d9-810b-5f7d085f1969" containerName="nova-cell1-conductor-db-sync" Mar 08 03:40:32.830497 
master-0 kubenswrapper[13046]: I0308 03:40:32.827685 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="854672de-d415-46d9-810b-5f7d085f1969" containerName="nova-cell1-conductor-db-sync" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: E0308 03:40:32.827726 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c587ce-777a-463e-877b-ada22b560d4c" containerName="init" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: I0308 03:40:32.827733 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c587ce-777a-463e-877b-ada22b560d4c" containerName="init" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: E0308 03:40:32.827748 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bd0477-6b38-4d58-b53b-34c4c323496b" containerName="nova-manage" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: I0308 03:40:32.827754 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bd0477-6b38-4d58-b53b-34c4c323496b" containerName="nova-manage" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: E0308 03:40:32.827805 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c587ce-777a-463e-877b-ada22b560d4c" containerName="dnsmasq-dns" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: I0308 03:40:32.827815 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c587ce-777a-463e-877b-ada22b560d4c" containerName="dnsmasq-dns" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: I0308 03:40:32.828062 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c587ce-777a-463e-877b-ada22b560d4c" containerName="dnsmasq-dns" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: I0308 03:40:32.828124 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="854672de-d415-46d9-810b-5f7d085f1969" containerName="nova-cell1-conductor-db-sync" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: I0308 03:40:32.828138 13046 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b3bd0477-6b38-4d58-b53b-34c4c323496b" containerName="nova-manage" Mar 08 03:40:32.830497 master-0 kubenswrapper[13046]: I0308 03:40:32.829138 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:32.837528 master-0 kubenswrapper[13046]: I0308 03:40:32.834892 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 03:40:32.861050 master-0 kubenswrapper[13046]: I0308 03:40:32.852706 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-log" containerID="cri-o://442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c" gracePeriod=30 Mar 08 03:40:32.861050 master-0 kubenswrapper[13046]: I0308 03:40:32.852842 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerStarted","Data":"e3d9520f0b8237900f373aff18a15d3a2287cf37a1c07bcdc58f5218236b93af"} Mar 08 03:40:32.861050 master-0 kubenswrapper[13046]: I0308 03:40:32.852877 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 08 03:40:32.861050 master-0 kubenswrapper[13046]: I0308 03:40:32.852887 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 08 03:40:32.861050 master-0 kubenswrapper[13046]: I0308 03:40:32.852896 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"528b1064-a3b2-4ea4-8584-abeffdbedbbe","Type":"ContainerStarted","Data":"4322d2182b7d0519015727da1195c9849383c61e45e3e0d0186b89a9ab7284d3"} Mar 08 03:40:32.861050 master-0 kubenswrapper[13046]: I0308 03:40:32.852936 13046 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-metadata" containerID="cri-o://0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1" gracePeriod=30 Mar 08 03:40:32.867158 master-0 kubenswrapper[13046]: I0308 03:40:32.864363 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 03:40:32.894707 master-0 kubenswrapper[13046]: I0308 03:40:32.894631 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=71.980003752 podStartE2EDuration="1m47.894614705s" podCreationTimestamp="2026-03-08 03:38:45 +0000 UTC" firstStartedPulling="2026-03-08 03:38:56.367605064 +0000 UTC m=+1538.446372281" lastFinishedPulling="2026-03-08 03:39:32.282215977 +0000 UTC m=+1574.360983234" observedRunningTime="2026-03-08 03:40:32.881205895 +0000 UTC m=+1634.959973112" watchObservedRunningTime="2026-03-08 03:40:32.894614705 +0000 UTC m=+1634.973381922" Mar 08 03:40:32.947773 master-0 kubenswrapper[13046]: I0308 03:40:32.947732 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:40:32.984616 master-0 kubenswrapper[13046]: I0308 03:40:32.984566 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 03:40:32.984701 master-0 kubenswrapper[13046]: I0308 03:40:32.984632 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 03:40:32.999294 master-0 kubenswrapper[13046]: I0308 03:40:32.999240 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:32.999419 master-0 kubenswrapper[13046]: I0308 03:40:32.999394 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:32.999640 master-0 kubenswrapper[13046]: I0308 03:40:32.999620 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ql4r\" (UniqueName: \"kubernetes.io/projected/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-kube-api-access-4ql4r\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.102114 master-0 kubenswrapper[13046]: I0308 03:40:33.102051 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k7rw\" (UniqueName: \"kubernetes.io/projected/1c41abde-9569-460b-9461-69eabb5eb006-kube-api-access-8k7rw\") pod \"1c41abde-9569-460b-9461-69eabb5eb006\" (UID: 
\"1c41abde-9569-460b-9461-69eabb5eb006\") " Mar 08 03:40:33.102321 master-0 kubenswrapper[13046]: I0308 03:40:33.102165 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-combined-ca-bundle\") pod \"1c41abde-9569-460b-9461-69eabb5eb006\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " Mar 08 03:40:33.102321 master-0 kubenswrapper[13046]: I0308 03:40:33.102266 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-config-data\") pod \"1c41abde-9569-460b-9461-69eabb5eb006\" (UID: \"1c41abde-9569-460b-9461-69eabb5eb006\") " Mar 08 03:40:33.103423 master-0 kubenswrapper[13046]: I0308 03:40:33.103375 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ql4r\" (UniqueName: \"kubernetes.io/projected/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-kube-api-access-4ql4r\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.103559 master-0 kubenswrapper[13046]: I0308 03:40:33.103528 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.103697 master-0 kubenswrapper[13046]: I0308 03:40:33.103664 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.105949 master-0 
kubenswrapper[13046]: I0308 03:40:33.105902 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c41abde-9569-460b-9461-69eabb5eb006-kube-api-access-8k7rw" (OuterVolumeSpecName: "kube-api-access-8k7rw") pod "1c41abde-9569-460b-9461-69eabb5eb006" (UID: "1c41abde-9569-460b-9461-69eabb5eb006"). InnerVolumeSpecName "kube-api-access-8k7rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:40:33.110255 master-0 kubenswrapper[13046]: I0308 03:40:33.110216 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.113913 master-0 kubenswrapper[13046]: I0308 03:40:33.113832 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.124617 master-0 kubenswrapper[13046]: I0308 03:40:33.124572 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ql4r\" (UniqueName: \"kubernetes.io/projected/efc783b5-a9ba-494e-8e7e-3a1e26d4194c-kube-api-access-4ql4r\") pod \"nova-cell1-conductor-0\" (UID: \"efc783b5-a9ba-494e-8e7e-3a1e26d4194c\") " pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.141161 master-0 kubenswrapper[13046]: I0308 03:40:33.140904 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-config-data" (OuterVolumeSpecName: "config-data") pod "1c41abde-9569-460b-9461-69eabb5eb006" (UID: "1c41abde-9569-460b-9461-69eabb5eb006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:33.151138 master-0 kubenswrapper[13046]: I0308 03:40:33.150751 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c41abde-9569-460b-9461-69eabb5eb006" (UID: "1c41abde-9569-460b-9461-69eabb5eb006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:33.205528 master-0 kubenswrapper[13046]: I0308 03:40:33.205404 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k7rw\" (UniqueName: \"kubernetes.io/projected/1c41abde-9569-460b-9461-69eabb5eb006-kube-api-access-8k7rw\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.205528 master-0 kubenswrapper[13046]: I0308 03:40:33.205460 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.205528 master-0 kubenswrapper[13046]: I0308 03:40:33.205471 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c41abde-9569-460b-9461-69eabb5eb006-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.245587 master-0 kubenswrapper[13046]: I0308 03:40:33.244290 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:33.481469 master-0 kubenswrapper[13046]: I0308 03:40:33.478381 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0" Mar 08 03:40:33.567409 master-0 kubenswrapper[13046]: I0308 03:40:33.567363 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 03:40:33.715939 master-0 kubenswrapper[13046]: I0308 03:40:33.715874 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-config-data\") pod \"27bed797-75f3-4acf-9349-27eb09c2a7f6\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " Mar 08 03:40:33.716137 master-0 kubenswrapper[13046]: I0308 03:40:33.716005 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27bed797-75f3-4acf-9349-27eb09c2a7f6-logs\") pod \"27bed797-75f3-4acf-9349-27eb09c2a7f6\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " Mar 08 03:40:33.716137 master-0 kubenswrapper[13046]: I0308 03:40:33.716079 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-combined-ca-bundle\") pod \"27bed797-75f3-4acf-9349-27eb09c2a7f6\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " Mar 08 03:40:33.716137 master-0 kubenswrapper[13046]: I0308 03:40:33.716125 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-nova-metadata-tls-certs\") pod \"27bed797-75f3-4acf-9349-27eb09c2a7f6\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " Mar 08 03:40:33.716241 master-0 kubenswrapper[13046]: I0308 03:40:33.716161 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk7fs\" (UniqueName: \"kubernetes.io/projected/27bed797-75f3-4acf-9349-27eb09c2a7f6-kube-api-access-dk7fs\") pod \"27bed797-75f3-4acf-9349-27eb09c2a7f6\" (UID: \"27bed797-75f3-4acf-9349-27eb09c2a7f6\") " Mar 08 03:40:33.719966 master-0 kubenswrapper[13046]: I0308 03:40:33.719924 13046 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27bed797-75f3-4acf-9349-27eb09c2a7f6-logs" (OuterVolumeSpecName: "logs") pod "27bed797-75f3-4acf-9349-27eb09c2a7f6" (UID: "27bed797-75f3-4acf-9349-27eb09c2a7f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:40:33.720449 master-0 kubenswrapper[13046]: I0308 03:40:33.720402 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27bed797-75f3-4acf-9349-27eb09c2a7f6-kube-api-access-dk7fs" (OuterVolumeSpecName: "kube-api-access-dk7fs") pod "27bed797-75f3-4acf-9349-27eb09c2a7f6" (UID: "27bed797-75f3-4acf-9349-27eb09c2a7f6"). InnerVolumeSpecName "kube-api-access-dk7fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:40:33.745011 master-0 kubenswrapper[13046]: I0308 03:40:33.744926 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27bed797-75f3-4acf-9349-27eb09c2a7f6" (UID: "27bed797-75f3-4acf-9349-27eb09c2a7f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:33.752199 master-0 kubenswrapper[13046]: I0308 03:40:33.752149 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-config-data" (OuterVolumeSpecName: "config-data") pod "27bed797-75f3-4acf-9349-27eb09c2a7f6" (UID: "27bed797-75f3-4acf-9349-27eb09c2a7f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:33.786790 master-0 kubenswrapper[13046]: I0308 03:40:33.786740 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "27bed797-75f3-4acf-9349-27eb09c2a7f6" (UID: "27bed797-75f3-4acf-9349-27eb09c2a7f6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:33.819010 master-0 kubenswrapper[13046]: I0308 03:40:33.818965 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.819010 master-0 kubenswrapper[13046]: I0308 03:40:33.819001 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27bed797-75f3-4acf-9349-27eb09c2a7f6-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.819010 master-0 kubenswrapper[13046]: I0308 03:40:33.819011 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.819010 master-0 kubenswrapper[13046]: I0308 03:40:33.819021 13046 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/27bed797-75f3-4acf-9349-27eb09c2a7f6-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.819683 master-0 kubenswrapper[13046]: I0308 03:40:33.819033 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk7fs\" (UniqueName: \"kubernetes.io/projected/27bed797-75f3-4acf-9349-27eb09c2a7f6-kube-api-access-dk7fs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:33.853464 
master-0 kubenswrapper[13046]: W0308 03:40:33.853411 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefc783b5_a9ba_494e_8e7e_3a1e26d4194c.slice/crio-ad311c1807af49da622388b958e2d3a6e34cb29c9b18953f181f05f2c758c1cc WatchSource:0}: Error finding container ad311c1807af49da622388b958e2d3a6e34cb29c9b18953f181f05f2c758c1cc: Status 404 returned error can't find the container with id ad311c1807af49da622388b958e2d3a6e34cb29c9b18953f181f05f2c758c1cc Mar 08 03:40:33.859580 master-0 kubenswrapper[13046]: I0308 03:40:33.858253 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 03:40:33.867820 master-0 kubenswrapper[13046]: I0308 03:40:33.864465 13046 generic.go:334] "Generic (PLEG): container finished" podID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerID="0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1" exitCode=0 Mar 08 03:40:33.867820 master-0 kubenswrapper[13046]: I0308 03:40:33.864505 13046 generic.go:334] "Generic (PLEG): container finished" podID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerID="442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c" exitCode=143 Mar 08 03:40:33.867820 master-0 kubenswrapper[13046]: I0308 03:40:33.864537 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27bed797-75f3-4acf-9349-27eb09c2a7f6","Type":"ContainerDied","Data":"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1"} Mar 08 03:40:33.867820 master-0 kubenswrapper[13046]: I0308 03:40:33.864560 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"27bed797-75f3-4acf-9349-27eb09c2a7f6","Type":"ContainerDied","Data":"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c"} Mar 08 03:40:33.867820 master-0 kubenswrapper[13046]: I0308 03:40:33.864569 13046 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"27bed797-75f3-4acf-9349-27eb09c2a7f6","Type":"ContainerDied","Data":"d773ebf3f298020972d5d7b606ac52e412e2f62233034c557c1a542c1ec3a6e0"} Mar 08 03:40:33.867820 master-0 kubenswrapper[13046]: I0308 03:40:33.864584 13046 scope.go:117] "RemoveContainer" containerID="0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1" Mar 08 03:40:33.867820 master-0 kubenswrapper[13046]: I0308 03:40:33.864687 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 03:40:33.869928 master-0 kubenswrapper[13046]: I0308 03:40:33.869049 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:40:33.872299 master-0 kubenswrapper[13046]: I0308 03:40:33.872203 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1c41abde-9569-460b-9461-69eabb5eb006","Type":"ContainerDied","Data":"611d5789c67e314393f9393a7c114b42be7c67c97a03086b40b819e8dcd516a1"} Mar 08 03:40:33.889775 master-0 kubenswrapper[13046]: I0308 03:40:33.889685 13046 scope.go:117] "RemoveContainer" containerID="442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c" Mar 08 03:40:33.913011 master-0 kubenswrapper[13046]: I0308 03:40:33.912946 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:40:33.929632 master-0 kubenswrapper[13046]: I0308 03:40:33.929553 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:40:33.946591 master-0 kubenswrapper[13046]: I0308 03:40:33.945049 13046 scope.go:117] "RemoveContainer" containerID="0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1" Mar 08 03:40:33.946591 master-0 kubenswrapper[13046]: E0308 03:40:33.946083 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1\": container with ID starting with 0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1 not found: ID does not exist" containerID="0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1" Mar 08 03:40:33.946591 master-0 kubenswrapper[13046]: I0308 03:40:33.946113 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1"} err="failed to get container status \"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1\": rpc error: code = NotFound desc = could not find container \"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1\": container with ID starting with 0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1 not found: ID does not exist" Mar 08 03:40:33.946591 master-0 kubenswrapper[13046]: I0308 03:40:33.946135 13046 scope.go:117] "RemoveContainer" containerID="442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c" Mar 08 03:40:33.948726 master-0 kubenswrapper[13046]: E0308 03:40:33.948677 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c\": container with ID starting with 442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c not found: ID does not exist" containerID="442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c" Mar 08 03:40:33.948726 master-0 kubenswrapper[13046]: I0308 03:40:33.948714 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c"} err="failed to get container status \"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c\": rpc error: code = NotFound desc = could not find container 
\"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c\": container with ID starting with 442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c not found: ID does not exist" Mar 08 03:40:33.948873 master-0 kubenswrapper[13046]: I0308 03:40:33.948737 13046 scope.go:117] "RemoveContainer" containerID="0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1" Mar 08 03:40:33.952545 master-0 kubenswrapper[13046]: I0308 03:40:33.952496 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1"} err="failed to get container status \"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1\": rpc error: code = NotFound desc = could not find container \"0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1\": container with ID starting with 0669652cf00b1adcfa0764a20e396e17010a8bb559f77c1bd0c03f6033c911f1 not found: ID does not exist" Mar 08 03:40:33.952545 master-0 kubenswrapper[13046]: I0308 03:40:33.952538 13046 scope.go:117] "RemoveContainer" containerID="442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c" Mar 08 03:40:33.953664 master-0 kubenswrapper[13046]: I0308 03:40:33.953598 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c"} err="failed to get container status \"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c\": rpc error: code = NotFound desc = could not find container \"442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c\": container with ID starting with 442ca296bfef36501171baec4b87ab9af98d1c341f7c5b7773402ea91806a67c not found: ID does not exist" Mar 08 03:40:33.953734 master-0 kubenswrapper[13046]: I0308 03:40:33.953669 13046 scope.go:117] "RemoveContainer" containerID="41f768ca2eaf5193c192195022780bad5bc651063531cb75afa60ba1a45cf69d" Mar 08 
03:40:33.956849 master-0 kubenswrapper[13046]: I0308 03:40:33.956814 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:40:33.987761 master-0 kubenswrapper[13046]: I0308 03:40:33.987625 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:40:34.027700 master-0 kubenswrapper[13046]: I0308 03:40:34.027613 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:40:34.028219 master-0 kubenswrapper[13046]: E0308 03:40:34.028195 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-log" Mar 08 03:40:34.028219 master-0 kubenswrapper[13046]: I0308 03:40:34.028215 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-log" Mar 08 03:40:34.028361 master-0 kubenswrapper[13046]: E0308 03:40:34.028244 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c41abde-9569-460b-9461-69eabb5eb006" containerName="nova-scheduler-scheduler" Mar 08 03:40:34.028361 master-0 kubenswrapper[13046]: I0308 03:40:34.028251 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c41abde-9569-460b-9461-69eabb5eb006" containerName="nova-scheduler-scheduler" Mar 08 03:40:34.028361 master-0 kubenswrapper[13046]: E0308 03:40:34.028265 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-metadata" Mar 08 03:40:34.028361 master-0 kubenswrapper[13046]: I0308 03:40:34.028273 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-metadata" Mar 08 03:40:34.028494 master-0 kubenswrapper[13046]: I0308 03:40:34.028469 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c41abde-9569-460b-9461-69eabb5eb006" 
containerName="nova-scheduler-scheduler" Mar 08 03:40:34.028533 master-0 kubenswrapper[13046]: I0308 03:40:34.028506 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-log" Mar 08 03:40:34.028599 master-0 kubenswrapper[13046]: I0308 03:40:34.028537 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" containerName="nova-metadata-metadata" Mar 08 03:40:34.030264 master-0 kubenswrapper[13046]: I0308 03:40:34.029282 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:40:34.032316 master-0 kubenswrapper[13046]: I0308 03:40:34.031742 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 03:40:34.041912 master-0 kubenswrapper[13046]: I0308 03:40:34.041822 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:40:34.052152 master-0 kubenswrapper[13046]: I0308 03:40:34.052097 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:40:34.057863 master-0 kubenswrapper[13046]: I0308 03:40:34.057766 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 03:40:34.062562 master-0 kubenswrapper[13046]: I0308 03:40:34.060785 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 03:40:34.062562 master-0 kubenswrapper[13046]: I0308 03:40:34.060945 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 03:40:34.064774 master-0 kubenswrapper[13046]: I0308 03:40:34.064740 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:40:34.125065 master-0 kubenswrapper[13046]: I0308 03:40:34.124990 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-config-data\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.125065 master-0 kubenswrapper[13046]: I0308 03:40:34.125029 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.125065 master-0 kubenswrapper[13046]: I0308 03:40:34.125054 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-config-data\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.125673 master-0 kubenswrapper[13046]: I0308 03:40:34.125080 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/761f3f7f-0243-43bb-a6c7-a702fc601758-logs\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.125673 master-0 kubenswrapper[13046]: I0308 03:40:34.125110 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.125673 master-0 kubenswrapper[13046]: I0308 03:40:34.125158 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ftl\" (UniqueName: \"kubernetes.io/projected/761f3f7f-0243-43bb-a6c7-a702fc601758-kube-api-access-v9ftl\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.125673 master-0 kubenswrapper[13046]: I0308 03:40:34.125179 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.125673 master-0 kubenswrapper[13046]: I0308 03:40:34.125204 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8zq5\" (UniqueName: \"kubernetes.io/projected/b624d075-24f1-4e0a-8638-d4a694ab697f-kube-api-access-x8zq5\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.143716 master-0 kubenswrapper[13046]: I0308 03:40:34.143672 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c41abde-9569-460b-9461-69eabb5eb006" 
path="/var/lib/kubelet/pods/1c41abde-9569-460b-9461-69eabb5eb006/volumes" Mar 08 03:40:34.144724 master-0 kubenswrapper[13046]: I0308 03:40:34.144702 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27bed797-75f3-4acf-9349-27eb09c2a7f6" path="/var/lib/kubelet/pods/27bed797-75f3-4acf-9349-27eb09c2a7f6/volumes" Mar 08 03:40:34.227436 master-0 kubenswrapper[13046]: I0308 03:40:34.227384 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-config-data\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.227883 master-0 kubenswrapper[13046]: I0308 03:40:34.227839 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.227962 master-0 kubenswrapper[13046]: I0308 03:40:34.227883 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-config-data\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.227962 master-0 kubenswrapper[13046]: I0308 03:40:34.227929 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761f3f7f-0243-43bb-a6c7-a702fc601758-logs\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.228201 master-0 kubenswrapper[13046]: I0308 03:40:34.228175 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.228269 master-0 kubenswrapper[13046]: I0308 03:40:34.228242 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ftl\" (UniqueName: \"kubernetes.io/projected/761f3f7f-0243-43bb-a6c7-a702fc601758-kube-api-access-v9ftl\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.228328 master-0 kubenswrapper[13046]: I0308 03:40:34.228271 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.228328 master-0 kubenswrapper[13046]: I0308 03:40:34.228297 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8zq5\" (UniqueName: \"kubernetes.io/projected/b624d075-24f1-4e0a-8638-d4a694ab697f-kube-api-access-x8zq5\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.228801 master-0 kubenswrapper[13046]: I0308 03:40:34.228769 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761f3f7f-0243-43bb-a6c7-a702fc601758-logs\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.234646 master-0 kubenswrapper[13046]: I0308 03:40:34.234065 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-config-data\") pod 
\"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.234646 master-0 kubenswrapper[13046]: I0308 03:40:34.234065 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.234646 master-0 kubenswrapper[13046]: I0308 03:40:34.234536 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.234646 master-0 kubenswrapper[13046]: I0308 03:40:34.234615 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-config-data\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.235571 master-0 kubenswrapper[13046]: I0308 03:40:34.235533 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 03:40:34.246305 master-0 kubenswrapper[13046]: I0308 03:40:34.246263 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ftl\" (UniqueName: \"kubernetes.io/projected/761f3f7f-0243-43bb-a6c7-a702fc601758-kube-api-access-v9ftl\") pod \"nova-metadata-0\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") " pod="openstack/nova-metadata-0" Mar 08 
03:40:34.247816 master-0 kubenswrapper[13046]: I0308 03:40:34.247755 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8zq5\" (UniqueName: \"kubernetes.io/projected/b624d075-24f1-4e0a-8638-d4a694ab697f-kube-api-access-x8zq5\") pod \"nova-scheduler-0\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " pod="openstack/nova-scheduler-0" Mar 08 03:40:34.355568 master-0 kubenswrapper[13046]: I0308 03:40:34.354920 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:40:34.381635 master-0 kubenswrapper[13046]: I0308 03:40:34.378046 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 03:40:34.852919 master-0 kubenswrapper[13046]: I0308 03:40:34.847600 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:40:34.897560 master-0 kubenswrapper[13046]: I0308 03:40:34.897459 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"efc783b5-a9ba-494e-8e7e-3a1e26d4194c","Type":"ContainerStarted","Data":"ed06f59c6d608a2a0c3c97160e442591fc2d3dd7209acb4cbee44b85f3c9f175"} Mar 08 03:40:34.897560 master-0 kubenswrapper[13046]: I0308 03:40:34.897554 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"efc783b5-a9ba-494e-8e7e-3a1e26d4194c","Type":"ContainerStarted","Data":"ad311c1807af49da622388b958e2d3a6e34cb29c9b18953f181f05f2c758c1cc"} Mar 08 03:40:34.899066 master-0 kubenswrapper[13046]: I0308 03:40:34.899024 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:34.902045 master-0 kubenswrapper[13046]: I0308 03:40:34.901895 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/ironic-conductor-0" podUID="528b1064-a3b2-4ea4-8584-abeffdbedbbe" containerName="ironic-conductor" 
probeResult="failure" output=< Mar 08 03:40:34.902045 master-0 kubenswrapper[13046]: ironic-conductor-0 is offline Mar 08 03:40:34.902045 master-0 kubenswrapper[13046]: > Mar 08 03:40:34.905420 master-0 kubenswrapper[13046]: I0308 03:40:34.905363 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b624d075-24f1-4e0a-8638-d4a694ab697f","Type":"ContainerStarted","Data":"5556917f01ce675bc3432988d09c68aff2243c0f1ea22d6bea40a4e8fa186f03"} Mar 08 03:40:34.945623 master-0 kubenswrapper[13046]: I0308 03:40:34.945498 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.945452661 podStartE2EDuration="2.945452661s" podCreationTimestamp="2026-03-08 03:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:34.93306955 +0000 UTC m=+1637.011836767" watchObservedRunningTime="2026-03-08 03:40:34.945452661 +0000 UTC m=+1637.024219878" Mar 08 03:40:34.984372 master-0 kubenswrapper[13046]: I0308 03:40:34.983873 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:40:35.275610 master-0 kubenswrapper[13046]: I0308 03:40:35.275563 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:40:35.359785 master-0 kubenswrapper[13046]: I0308 03:40:35.359685 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5tr7\" (UniqueName: \"kubernetes.io/projected/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-kube-api-access-s5tr7\") pod \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " Mar 08 03:40:35.360843 master-0 kubenswrapper[13046]: I0308 03:40:35.360033 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-combined-ca-bundle\") pod \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " Mar 08 03:40:35.360843 master-0 kubenswrapper[13046]: I0308 03:40:35.360137 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-config-data\") pod \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " Mar 08 03:40:35.360843 master-0 kubenswrapper[13046]: I0308 03:40:35.360181 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-logs\") pod \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\" (UID: \"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6\") " Mar 08 03:40:35.361155 master-0 kubenswrapper[13046]: I0308 03:40:35.360881 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-logs" (OuterVolumeSpecName: "logs") pod "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" (UID: "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:40:35.364556 master-0 kubenswrapper[13046]: I0308 03:40:35.363815 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-kube-api-access-s5tr7" (OuterVolumeSpecName: "kube-api-access-s5tr7") pod "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" (UID: "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6"). InnerVolumeSpecName "kube-api-access-s5tr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:40:35.398726 master-0 kubenswrapper[13046]: I0308 03:40:35.398667 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" (UID: "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:35.403861 master-0 kubenswrapper[13046]: I0308 03:40:35.403821 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-config-data" (OuterVolumeSpecName: "config-data") pod "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" (UID: "6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:35.462616 master-0 kubenswrapper[13046]: I0308 03:40:35.462574 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:35.462616 master-0 kubenswrapper[13046]: I0308 03:40:35.462617 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:35.462757 master-0 kubenswrapper[13046]: I0308 03:40:35.462626 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:35.462757 master-0 kubenswrapper[13046]: I0308 03:40:35.462638 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5tr7\" (UniqueName: \"kubernetes.io/projected/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6-kube-api-access-s5tr7\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:35.927292 master-0 kubenswrapper[13046]: I0308 03:40:35.926945 13046 generic.go:334] "Generic (PLEG): container finished" podID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerID="8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda" exitCode=0 Mar 08 03:40:35.927292 master-0 kubenswrapper[13046]: I0308 03:40:35.927038 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6","Type":"ContainerDied","Data":"8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda"} Mar 08 03:40:35.927292 master-0 kubenswrapper[13046]: I0308 03:40:35.927063 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:40:35.927292 master-0 kubenswrapper[13046]: I0308 03:40:35.927074 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6","Type":"ContainerDied","Data":"c2c5dd1e6918cc6ca9ce732faeef9dccaae8b6516eba2fa1b01c3e5d4fde44ee"} Mar 08 03:40:35.927292 master-0 kubenswrapper[13046]: I0308 03:40:35.927088 13046 scope.go:117] "RemoveContainer" containerID="8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda" Mar 08 03:40:35.930567 master-0 kubenswrapper[13046]: I0308 03:40:35.930258 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b624d075-24f1-4e0a-8638-d4a694ab697f","Type":"ContainerStarted","Data":"4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e"} Mar 08 03:40:35.942471 master-0 kubenswrapper[13046]: I0308 03:40:35.942311 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"761f3f7f-0243-43bb-a6c7-a702fc601758","Type":"ContainerStarted","Data":"3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a"} Mar 08 03:40:35.942471 master-0 kubenswrapper[13046]: I0308 03:40:35.942394 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"761f3f7f-0243-43bb-a6c7-a702fc601758","Type":"ContainerStarted","Data":"f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b"} Mar 08 03:40:35.942471 master-0 kubenswrapper[13046]: I0308 03:40:35.942412 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"761f3f7f-0243-43bb-a6c7-a702fc601758","Type":"ContainerStarted","Data":"db6a158a7eb9b6bf4968601d869ef4756e0040cf75c5b6afc2f678ae9ba76dea"} Mar 08 03:40:35.985803 master-0 kubenswrapper[13046]: I0308 03:40:35.984276 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.9842562580000003 podStartE2EDuration="2.984256258s" podCreationTimestamp="2026-03-08 03:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:35.957264143 +0000 UTC m=+1638.036031380" watchObservedRunningTime="2026-03-08 03:40:35.984256258 +0000 UTC m=+1638.063023475" Mar 08 03:40:36.009910 master-0 kubenswrapper[13046]: I0308 03:40:36.009660 13046 scope.go:117] "RemoveContainer" containerID="49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc" Mar 08 03:40:36.042606 master-0 kubenswrapper[13046]: I0308 03:40:36.038655 13046 scope.go:117] "RemoveContainer" containerID="8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda" Mar 08 03:40:36.042606 master-0 kubenswrapper[13046]: E0308 03:40:36.039434 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda\": container with ID starting with 8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda not found: ID does not exist" containerID="8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda" Mar 08 03:40:36.042606 master-0 kubenswrapper[13046]: I0308 03:40:36.039512 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda"} err="failed to get container status \"8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda\": rpc error: code = NotFound desc = could not find container \"8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda\": container with ID starting with 8858fcf3abd2ce0ebf7ec369744752a67c2e063006bf953d5b60c4f29fc82bda not found: ID does not exist" Mar 08 03:40:36.042606 master-0 kubenswrapper[13046]: I0308 03:40:36.039549 13046 scope.go:117] 
"RemoveContainer" containerID="49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc" Mar 08 03:40:36.042606 master-0 kubenswrapper[13046]: I0308 03:40:36.041232 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.041208042 podStartE2EDuration="3.041208042s" podCreationTimestamp="2026-03-08 03:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:35.98856814 +0000 UTC m=+1638.067335367" watchObservedRunningTime="2026-03-08 03:40:36.041208042 +0000 UTC m=+1638.119975269" Mar 08 03:40:36.042606 master-0 kubenswrapper[13046]: E0308 03:40:36.041681 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc\": container with ID starting with 49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc not found: ID does not exist" containerID="49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc" Mar 08 03:40:36.042606 master-0 kubenswrapper[13046]: I0308 03:40:36.041737 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc"} err="failed to get container status \"49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc\": rpc error: code = NotFound desc = could not find container \"49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc\": container with ID starting with 49d53a4270407840b9e726df77fc43cdf43517e3ce4341d9788f977b945d12bc not found: ID does not exist" Mar 08 03:40:36.075529 master-0 kubenswrapper[13046]: I0308 03:40:36.075451 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:40:36.088031 master-0 kubenswrapper[13046]: I0308 03:40:36.087950 13046 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:40:36.098757 master-0 kubenswrapper[13046]: I0308 03:40:36.098709 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 03:40:36.099401 master-0 kubenswrapper[13046]: E0308 03:40:36.099364 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-api" Mar 08 03:40:36.099401 master-0 kubenswrapper[13046]: I0308 03:40:36.099394 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-api" Mar 08 03:40:36.099528 master-0 kubenswrapper[13046]: E0308 03:40:36.099416 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-log" Mar 08 03:40:36.099528 master-0 kubenswrapper[13046]: I0308 03:40:36.099426 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-log" Mar 08 03:40:36.099865 master-0 kubenswrapper[13046]: I0308 03:40:36.099834 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-api" Mar 08 03:40:36.099930 master-0 kubenswrapper[13046]: I0308 03:40:36.099889 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" containerName="nova-api-log" Mar 08 03:40:36.101522 master-0 kubenswrapper[13046]: I0308 03:40:36.101497 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:40:36.104360 master-0 kubenswrapper[13046]: I0308 03:40:36.104263 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 03:40:36.111349 master-0 kubenswrapper[13046]: I0308 03:40:36.111287 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:40:36.136782 master-0 kubenswrapper[13046]: I0308 03:40:36.136715 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6" path="/var/lib/kubelet/pods/6db5a8dd-611d-4dfb-aebf-70bc9ddec1e6/volumes" Mar 08 03:40:36.184938 master-0 kubenswrapper[13046]: I0308 03:40:36.184812 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.185603 master-0 kubenswrapper[13046]: I0308 03:40:36.185548 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qthj\" (UniqueName: \"kubernetes.io/projected/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-kube-api-access-9qthj\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.186073 master-0 kubenswrapper[13046]: I0308 03:40:36.186040 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-config-data\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.186460 master-0 kubenswrapper[13046]: I0308 03:40:36.186427 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-logs\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.289326 master-0 kubenswrapper[13046]: I0308 03:40:36.289260 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-logs\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.289588 master-0 kubenswrapper[13046]: I0308 03:40:36.289343 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.289588 master-0 kubenswrapper[13046]: I0308 03:40:36.289460 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qthj\" (UniqueName: \"kubernetes.io/projected/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-kube-api-access-9qthj\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.289588 master-0 kubenswrapper[13046]: I0308 03:40:36.289515 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-config-data\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.292572 master-0 kubenswrapper[13046]: I0308 03:40:36.292394 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-logs\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 
03:40:36.294716 master-0 kubenswrapper[13046]: I0308 03:40:36.294689 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-config-data\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.305198 master-0 kubenswrapper[13046]: I0308 03:40:36.305162 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.309432 master-0 kubenswrapper[13046]: I0308 03:40:36.309375 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qthj\" (UniqueName: \"kubernetes.io/projected/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-kube-api-access-9qthj\") pod \"nova-api-0\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " pod="openstack/nova-api-0" Mar 08 03:40:36.422692 master-0 kubenswrapper[13046]: I0308 03:40:36.422531 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:40:36.754647 master-0 kubenswrapper[13046]: I0308 03:40:36.754503 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0" Mar 08 03:40:36.802798 master-0 kubenswrapper[13046]: I0308 03:40:36.802712 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 08 03:40:37.041227 master-0 kubenswrapper[13046]: I0308 03:40:37.041089 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:40:37.969960 master-0 kubenswrapper[13046]: I0308 03:40:37.969844 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb","Type":"ContainerStarted","Data":"4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e"} Mar 08 03:40:37.969960 master-0 kubenswrapper[13046]: I0308 03:40:37.969948 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb","Type":"ContainerStarted","Data":"60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f"} Mar 08 03:40:37.970252 master-0 kubenswrapper[13046]: I0308 03:40:37.969972 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb","Type":"ContainerStarted","Data":"399b3b91a858d06d66fd571a074d0b0f626a3418cf15be1f00e7aeaab50c15df"} Mar 08 03:40:38.013731 master-0 kubenswrapper[13046]: I0308 03:40:38.013615 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.013583895 podStartE2EDuration="2.013583895s" podCreationTimestamp="2026-03-08 03:40:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:37.997258262 +0000 UTC m=+1640.076025549" 
watchObservedRunningTime="2026-03-08 03:40:38.013583895 +0000 UTC m=+1640.092351142" Mar 08 03:40:39.356099 master-0 kubenswrapper[13046]: I0308 03:40:39.355970 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 03:40:39.378747 master-0 kubenswrapper[13046]: I0308 03:40:39.378669 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 03:40:39.378747 master-0 kubenswrapper[13046]: I0308 03:40:39.378746 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 03:40:43.291791 master-0 kubenswrapper[13046]: I0308 03:40:43.291746 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 03:40:44.356747 master-0 kubenswrapper[13046]: I0308 03:40:44.356685 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 03:40:44.378572 master-0 kubenswrapper[13046]: I0308 03:40:44.378522 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 03:40:44.378776 master-0 kubenswrapper[13046]: I0308 03:40:44.378705 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 03:40:44.387174 master-0 kubenswrapper[13046]: I0308 03:40:44.387130 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 03:40:45.182938 master-0 kubenswrapper[13046]: I0308 03:40:45.181268 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 03:40:45.390723 master-0 kubenswrapper[13046]: I0308 03:40:45.390616 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-log" probeResult="failure" 
output="Get \"https://10.128.1.10:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:40:45.390723 master-0 kubenswrapper[13046]: I0308 03:40:45.390662 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.10:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:40:46.423200 master-0 kubenswrapper[13046]: I0308 03:40:46.423033 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 03:40:46.423200 master-0 kubenswrapper[13046]: I0308 03:40:46.423164 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 03:40:47.505690 master-0 kubenswrapper[13046]: I0308 03:40:47.505617 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.11:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 03:40:47.506238 master-0 kubenswrapper[13046]: I0308 03:40:47.505923 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.11:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 03:40:49.480114 master-0 kubenswrapper[13046]: I0308 03:40:49.480040 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 08 03:40:52.011800 master-0 kubenswrapper[13046]: I0308 03:40:52.011726 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.099566 master-0 kubenswrapper[13046]: I0308 03:40:52.099476 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-combined-ca-bundle\") pod \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " Mar 08 03:40:52.099813 master-0 kubenswrapper[13046]: I0308 03:40:52.099701 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55lbh\" (UniqueName: \"kubernetes.io/projected/223afba7-1a3a-4d5e-b512-ecafdccc5dab-kube-api-access-55lbh\") pod \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " Mar 08 03:40:52.100517 master-0 kubenswrapper[13046]: I0308 03:40:52.100495 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-config-data\") pod \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\" (UID: \"223afba7-1a3a-4d5e-b512-ecafdccc5dab\") " Mar 08 03:40:52.103064 master-0 kubenswrapper[13046]: I0308 03:40:52.103027 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223afba7-1a3a-4d5e-b512-ecafdccc5dab-kube-api-access-55lbh" (OuterVolumeSpecName: "kube-api-access-55lbh") pod "223afba7-1a3a-4d5e-b512-ecafdccc5dab" (UID: "223afba7-1a3a-4d5e-b512-ecafdccc5dab"). InnerVolumeSpecName "kube-api-access-55lbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:40:52.134607 master-0 kubenswrapper[13046]: I0308 03:40:52.134552 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-config-data" (OuterVolumeSpecName: "config-data") pod "223afba7-1a3a-4d5e-b512-ecafdccc5dab" (UID: "223afba7-1a3a-4d5e-b512-ecafdccc5dab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:52.143051 master-0 kubenswrapper[13046]: I0308 03:40:52.143003 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "223afba7-1a3a-4d5e-b512-ecafdccc5dab" (UID: "223afba7-1a3a-4d5e-b512-ecafdccc5dab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:40:52.206175 master-0 kubenswrapper[13046]: I0308 03:40:52.205673 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:52.206175 master-0 kubenswrapper[13046]: I0308 03:40:52.205733 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223afba7-1a3a-4d5e-b512-ecafdccc5dab-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:52.206175 master-0 kubenswrapper[13046]: I0308 03:40:52.205793 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55lbh\" (UniqueName: \"kubernetes.io/projected/223afba7-1a3a-4d5e-b512-ecafdccc5dab-kube-api-access-55lbh\") on node \"master-0\" DevicePath \"\"" Mar 08 03:40:52.265861 master-0 kubenswrapper[13046]: I0308 03:40:52.265787 13046 generic.go:334] "Generic (PLEG): container finished" 
podID="223afba7-1a3a-4d5e-b512-ecafdccc5dab" containerID="944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725" exitCode=137 Mar 08 03:40:52.266160 master-0 kubenswrapper[13046]: I0308 03:40:52.265854 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.266160 master-0 kubenswrapper[13046]: I0308 03:40:52.265858 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"223afba7-1a3a-4d5e-b512-ecafdccc5dab","Type":"ContainerDied","Data":"944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725"} Mar 08 03:40:52.266160 master-0 kubenswrapper[13046]: I0308 03:40:52.266015 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"223afba7-1a3a-4d5e-b512-ecafdccc5dab","Type":"ContainerDied","Data":"5c3461b6cc6c5c6c779123a8afe3f1cf9a2f8eb16a8ade241c5f79326bf4f373"} Mar 08 03:40:52.266160 master-0 kubenswrapper[13046]: I0308 03:40:52.266049 13046 scope.go:117] "RemoveContainer" containerID="944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725" Mar 08 03:40:52.312828 master-0 kubenswrapper[13046]: I0308 03:40:52.312761 13046 scope.go:117] "RemoveContainer" containerID="944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725" Mar 08 03:40:52.313405 master-0 kubenswrapper[13046]: E0308 03:40:52.313335 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725\": container with ID starting with 944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725 not found: ID does not exist" containerID="944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725" Mar 08 03:40:52.313560 master-0 kubenswrapper[13046]: I0308 03:40:52.313397 13046 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725"} err="failed to get container status \"944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725\": rpc error: code = NotFound desc = could not find container \"944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725\": container with ID starting with 944dace8ebbce0b710a80fa6786911c4c6d1d09f15e02e19a54ffba506a7e725 not found: ID does not exist" Mar 08 03:40:52.335201 master-0 kubenswrapper[13046]: I0308 03:40:52.335122 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:52.353636 master-0 kubenswrapper[13046]: I0308 03:40:52.353512 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:52.372110 master-0 kubenswrapper[13046]: I0308 03:40:52.372057 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:52.374568 master-0 kubenswrapper[13046]: E0308 03:40:52.373141 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223afba7-1a3a-4d5e-b512-ecafdccc5dab" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 03:40:52.374863 master-0 kubenswrapper[13046]: I0308 03:40:52.374795 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="223afba7-1a3a-4d5e-b512-ecafdccc5dab" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 03:40:52.375884 master-0 kubenswrapper[13046]: I0308 03:40:52.375822 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="223afba7-1a3a-4d5e-b512-ecafdccc5dab" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 03:40:52.377693 master-0 kubenswrapper[13046]: I0308 03:40:52.377636 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.384845 master-0 kubenswrapper[13046]: I0308 03:40:52.384779 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:52.386555 master-0 kubenswrapper[13046]: I0308 03:40:52.385631 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 08 03:40:52.386555 master-0 kubenswrapper[13046]: I0308 03:40:52.385894 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 03:40:52.386555 master-0 kubenswrapper[13046]: I0308 03:40:52.386020 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 08 03:40:52.520109 master-0 kubenswrapper[13046]: I0308 03:40:52.519966 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.520109 master-0 kubenswrapper[13046]: I0308 03:40:52.520032 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.520109 master-0 kubenswrapper[13046]: I0308 03:40:52.520057 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rzsj\" (UniqueName: \"kubernetes.io/projected/00cccaa9-e6cb-443c-ba0b-5477870e47be-kube-api-access-7rzsj\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.520410 master-0 kubenswrapper[13046]: I0308 03:40:52.520181 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.520410 master-0 kubenswrapper[13046]: I0308 03:40:52.520238 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.623203 master-0 kubenswrapper[13046]: I0308 03:40:52.623142 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.623839 master-0 kubenswrapper[13046]: I0308 03:40:52.623544 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rzsj\" (UniqueName: \"kubernetes.io/projected/00cccaa9-e6cb-443c-ba0b-5477870e47be-kube-api-access-7rzsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.624027 master-0 kubenswrapper[13046]: I0308 03:40:52.623929 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.624027 master-0 kubenswrapper[13046]: I0308 03:40:52.624004 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.624128 master-0 kubenswrapper[13046]: I0308 03:40:52.624057 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.627941 master-0 kubenswrapper[13046]: I0308 03:40:52.627730 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.627941 master-0 kubenswrapper[13046]: I0308 03:40:52.627830 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.628796 master-0 kubenswrapper[13046]: I0308 03:40:52.628581 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.630200 master-0 kubenswrapper[13046]: I0308 03:40:52.630115 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00cccaa9-e6cb-443c-ba0b-5477870e47be-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.647108 master-0 kubenswrapper[13046]: I0308 03:40:52.646269 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rzsj\" (UniqueName: \"kubernetes.io/projected/00cccaa9-e6cb-443c-ba0b-5477870e47be-kube-api-access-7rzsj\") pod \"nova-cell1-novncproxy-0\" (UID: \"00cccaa9-e6cb-443c-ba0b-5477870e47be\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:52.724513 master-0 kubenswrapper[13046]: I0308 03:40:52.724416 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:53.185721 master-0 kubenswrapper[13046]: W0308 03:40:53.185625 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00cccaa9_e6cb_443c_ba0b_5477870e47be.slice/crio-fd8871ecb6ef24c50599a1a3961dbda3ad1ff4dfea92fe19cf4c80b778a8153e WatchSource:0}: Error finding container fd8871ecb6ef24c50599a1a3961dbda3ad1ff4dfea92fe19cf4c80b778a8153e: Status 404 returned error can't find the container with id fd8871ecb6ef24c50599a1a3961dbda3ad1ff4dfea92fe19cf4c80b778a8153e Mar 08 03:40:53.190996 master-0 kubenswrapper[13046]: I0308 03:40:53.190953 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 03:40:53.285526 master-0 kubenswrapper[13046]: I0308 03:40:53.285183 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00cccaa9-e6cb-443c-ba0b-5477870e47be","Type":"ContainerStarted","Data":"fd8871ecb6ef24c50599a1a3961dbda3ad1ff4dfea92fe19cf4c80b778a8153e"} Mar 08 03:40:54.147097 master-0 kubenswrapper[13046]: I0308 03:40:54.146235 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223afba7-1a3a-4d5e-b512-ecafdccc5dab" path="/var/lib/kubelet/pods/223afba7-1a3a-4d5e-b512-ecafdccc5dab/volumes" Mar 08 03:40:54.296725 master-0 kubenswrapper[13046]: I0308 03:40:54.296325 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00cccaa9-e6cb-443c-ba0b-5477870e47be","Type":"ContainerStarted","Data":"6fd1e8e86c8dd4d83c53206152c59d7e9355052cbb41b5ce2f8846dffb2d4a07"} Mar 08 03:40:54.335129 master-0 kubenswrapper[13046]: I0308 03:40:54.335039 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.335017267 podStartE2EDuration="2.335017267s" podCreationTimestamp="2026-03-08 
03:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:40:54.31996771 +0000 UTC m=+1656.398734937" watchObservedRunningTime="2026-03-08 03:40:54.335017267 +0000 UTC m=+1656.413784484" Mar 08 03:40:54.389954 master-0 kubenswrapper[13046]: I0308 03:40:54.386547 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 03:40:54.389954 master-0 kubenswrapper[13046]: I0308 03:40:54.387020 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 03:40:54.395501 master-0 kubenswrapper[13046]: I0308 03:40:54.394658 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 03:40:55.318385 master-0 kubenswrapper[13046]: I0308 03:40:55.318330 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 03:40:56.426627 master-0 kubenswrapper[13046]: I0308 03:40:56.426558 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 03:40:56.427729 master-0 kubenswrapper[13046]: I0308 03:40:56.427693 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 03:40:56.428002 master-0 kubenswrapper[13046]: I0308 03:40:56.427974 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 03:40:56.431780 master-0 kubenswrapper[13046]: I0308 03:40:56.431721 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 03:40:57.341319 master-0 kubenswrapper[13046]: I0308 03:40:57.341227 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 03:40:57.347045 master-0 kubenswrapper[13046]: I0308 03:40:57.346967 13046 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 03:40:57.726870 master-0 kubenswrapper[13046]: I0308 03:40:57.726819 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:40:57.776472 master-0 kubenswrapper[13046]: I0308 03:40:57.776368 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74d79d4489-nq9k9"] Mar 08 03:40:57.779408 master-0 kubenswrapper[13046]: I0308 03:40:57.779367 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.787577 master-0 kubenswrapper[13046]: I0308 03:40:57.787503 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d79d4489-nq9k9"] Mar 08 03:40:57.860865 master-0 kubenswrapper[13046]: I0308 03:40:57.860666 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbpx\" (UniqueName: \"kubernetes.io/projected/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-kube-api-access-9tbpx\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.861073 master-0 kubenswrapper[13046]: I0308 03:40:57.860916 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-dns-svc\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.861119 master-0 kubenswrapper[13046]: I0308 03:40:57.861102 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-ovsdbserver-sb\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: 
\"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.861152 master-0 kubenswrapper[13046]: I0308 03:40:57.861132 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-config\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.861191 master-0 kubenswrapper[13046]: I0308 03:40:57.861179 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-ovsdbserver-nb\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.861229 master-0 kubenswrapper[13046]: I0308 03:40:57.861207 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-dns-swift-storage-0\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.963887 master-0 kubenswrapper[13046]: I0308 03:40:57.963763 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-ovsdbserver-sb\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.963887 master-0 kubenswrapper[13046]: I0308 03:40:57.963837 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-config\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.964116 master-0 kubenswrapper[13046]: I0308 03:40:57.963903 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-ovsdbserver-nb\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.964116 master-0 kubenswrapper[13046]: I0308 03:40:57.963943 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-dns-swift-storage-0\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.964116 master-0 kubenswrapper[13046]: I0308 03:40:57.963988 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tbpx\" (UniqueName: \"kubernetes.io/projected/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-kube-api-access-9tbpx\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.964116 master-0 kubenswrapper[13046]: I0308 03:40:57.964013 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-dns-svc\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.965148 master-0 kubenswrapper[13046]: I0308 03:40:57.965125 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-dns-svc\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.966351 master-0 kubenswrapper[13046]: I0308 03:40:57.966326 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-config\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.966532 master-0 kubenswrapper[13046]: I0308 03:40:57.966499 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-dns-swift-storage-0\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.966740 master-0 kubenswrapper[13046]: I0308 03:40:57.966718 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-ovsdbserver-nb\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.967225 master-0 kubenswrapper[13046]: I0308 03:40:57.967206 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-ovsdbserver-sb\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:57.982609 master-0 kubenswrapper[13046]: I0308 03:40:57.982518 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tbpx\" (UniqueName: 
\"kubernetes.io/projected/2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a-kube-api-access-9tbpx\") pod \"dnsmasq-dns-74d79d4489-nq9k9\" (UID: \"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a\") " pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:58.104244 master-0 kubenswrapper[13046]: I0308 03:40:58.104169 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:40:58.629743 master-0 kubenswrapper[13046]: I0308 03:40:58.629689 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d79d4489-nq9k9"] Mar 08 03:40:59.391390 master-0 kubenswrapper[13046]: I0308 03:40:59.391336 13046 generic.go:334] "Generic (PLEG): container finished" podID="2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a" containerID="d61c2752e14de016f07174a5dbe630bf573d854384924295ef6f9d6d040e1c19" exitCode=0 Mar 08 03:40:59.391969 master-0 kubenswrapper[13046]: I0308 03:40:59.391869 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" event={"ID":"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a","Type":"ContainerDied","Data":"d61c2752e14de016f07174a5dbe630bf573d854384924295ef6f9d6d040e1c19"} Mar 08 03:40:59.391969 master-0 kubenswrapper[13046]: I0308 03:40:59.391920 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" event={"ID":"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a","Type":"ContainerStarted","Data":"b532f6ceac73e039c733ae4e616a7c74c8d6f9a9765ae69e5c578808225d1772"} Mar 08 03:41:00.406412 master-0 kubenswrapper[13046]: I0308 03:41:00.406365 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" event={"ID":"2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a","Type":"ContainerStarted","Data":"59cb1871cb6df755d11b479c1106dbfc9dece401007fba3bb43a0d3d77796419"} Mar 08 03:41:00.406922 master-0 kubenswrapper[13046]: I0308 03:41:00.406643 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:41:00.415702 master-0 kubenswrapper[13046]: I0308 03:41:00.415638 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:41:00.416035 master-0 kubenswrapper[13046]: I0308 03:41:00.415985 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-log" containerID="cri-o://60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f" gracePeriod=30 Mar 08 03:41:00.416092 master-0 kubenswrapper[13046]: I0308 03:41:00.416046 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-api" containerID="cri-o://4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e" gracePeriod=30 Mar 08 03:41:00.471619 master-0 kubenswrapper[13046]: I0308 03:41:00.471341 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" podStartSLOduration=3.471321566 podStartE2EDuration="3.471321566s" podCreationTimestamp="2026-03-08 03:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:41:00.448895161 +0000 UTC m=+1662.527662378" watchObservedRunningTime="2026-03-08 03:41:00.471321566 +0000 UTC m=+1662.550088783" Mar 08 03:41:01.422023 master-0 kubenswrapper[13046]: I0308 03:41:01.421947 13046 generic.go:334] "Generic (PLEG): container finished" podID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerID="60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f" exitCode=143 Mar 08 03:41:01.422647 master-0 kubenswrapper[13046]: I0308 03:41:01.422029 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb","Type":"ContainerDied","Data":"60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f"} Mar 08 03:41:02.745111 master-0 kubenswrapper[13046]: I0308 03:41:02.745037 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:41:02.768283 master-0 kubenswrapper[13046]: I0308 03:41:02.768221 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:41:03.473627 master-0 kubenswrapper[13046]: I0308 03:41:03.473471 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 03:41:03.734344 master-0 kubenswrapper[13046]: I0308 03:41:03.734037 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8n5sd"] Mar 08 03:41:03.743115 master-0 kubenswrapper[13046]: I0308 03:41:03.742854 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-jgj9r"] Mar 08 03:41:03.743599 master-0 kubenswrapper[13046]: I0308 03:41:03.743580 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.757532 master-0 kubenswrapper[13046]: I0308 03:41:03.748079 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 08 03:41:03.757532 master-0 kubenswrapper[13046]: I0308 03:41:03.748234 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.757532 master-0 kubenswrapper[13046]: I0308 03:41:03.748272 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 08 03:41:03.789040 master-0 kubenswrapper[13046]: I0308 03:41:03.788971 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8n5sd"] Mar 08 03:41:03.805649 master-0 kubenswrapper[13046]: I0308 03:41:03.805616 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-jgj9r"] Mar 08 03:41:03.817789 master-0 kubenswrapper[13046]: I0308 03:41:03.817758 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-scripts\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.817940 master-0 kubenswrapper[13046]: I0308 03:41:03.817921 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfkqr\" (UniqueName: \"kubernetes.io/projected/c73db713-dc45-42da-a26c-a6cf8d83821a-kube-api-access-zfkqr\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.818134 master-0 kubenswrapper[13046]: I0308 03:41:03.818117 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-scripts\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.818221 master-0 kubenswrapper[13046]: I0308 03:41:03.818209 13046 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55p2g\" (UniqueName: \"kubernetes.io/projected/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-kube-api-access-55p2g\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.818310 master-0 kubenswrapper[13046]: I0308 03:41:03.818298 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-config-data\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.818419 master-0 kubenswrapper[13046]: I0308 03:41:03.818404 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-combined-ca-bundle\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.818541 master-0 kubenswrapper[13046]: I0308 03:41:03.818527 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.818644 master-0 kubenswrapper[13046]: I0308 03:41:03.818631 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-config-data\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " 
pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.926534 master-0 kubenswrapper[13046]: I0308 03:41:03.926458 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-scripts\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.926731 master-0 kubenswrapper[13046]: I0308 03:41:03.926566 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfkqr\" (UniqueName: \"kubernetes.io/projected/c73db713-dc45-42da-a26c-a6cf8d83821a-kube-api-access-zfkqr\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.926831 master-0 kubenswrapper[13046]: I0308 03:41:03.926801 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-scripts\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.926886 master-0 kubenswrapper[13046]: I0308 03:41:03.926848 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55p2g\" (UniqueName: \"kubernetes.io/projected/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-kube-api-access-55p2g\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.926941 master-0 kubenswrapper[13046]: I0308 03:41:03.926918 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-config-data\") pod \"nova-cell1-host-discover-jgj9r\" (UID: 
\"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.927016 master-0 kubenswrapper[13046]: I0308 03:41:03.926994 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-combined-ca-bundle\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.927250 master-0 kubenswrapper[13046]: I0308 03:41:03.927191 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.927300 master-0 kubenswrapper[13046]: I0308 03:41:03.927274 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-config-data\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.931208 master-0 kubenswrapper[13046]: I0308 03:41:03.931175 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-scripts\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.932378 master-0 kubenswrapper[13046]: I0308 03:41:03.932341 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-combined-ca-bundle\") pod \"nova-cell1-host-discover-jgj9r\" 
(UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.934335 master-0 kubenswrapper[13046]: I0308 03:41:03.934294 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-config-data\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.937582 master-0 kubenswrapper[13046]: I0308 03:41:03.937474 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-config-data\") pod \"nova-cell1-host-discover-jgj9r\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.938877 master-0 kubenswrapper[13046]: I0308 03:41:03.938813 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-scripts\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.953787 master-0 kubenswrapper[13046]: I0308 03:41:03.953688 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:03.962218 master-0 kubenswrapper[13046]: I0308 03:41:03.962171 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55p2g\" (UniqueName: \"kubernetes.io/projected/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-kube-api-access-55p2g\") pod \"nova-cell1-host-discover-jgj9r\" (UID: 
\"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:03.966696 master-0 kubenswrapper[13046]: I0308 03:41:03.966652 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfkqr\" (UniqueName: \"kubernetes.io/projected/c73db713-dc45-42da-a26c-a6cf8d83821a-kube-api-access-zfkqr\") pod \"nova-cell1-cell-mapping-8n5sd\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:04.079928 master-0 kubenswrapper[13046]: I0308 03:41:04.079798 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:04.102652 master-0 kubenswrapper[13046]: I0308 03:41:04.102583 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:04.249026 master-0 kubenswrapper[13046]: I0308 03:41:04.248333 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:41:04.337443 master-0 kubenswrapper[13046]: I0308 03:41:04.337339 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-config-data\") pod \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " Mar 08 03:41:04.337678 master-0 kubenswrapper[13046]: I0308 03:41:04.337598 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-combined-ca-bundle\") pod \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " Mar 08 03:41:04.337678 master-0 kubenswrapper[13046]: I0308 03:41:04.337664 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qthj\" (UniqueName: \"kubernetes.io/projected/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-kube-api-access-9qthj\") pod \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " Mar 08 03:41:04.337779 master-0 kubenswrapper[13046]: I0308 03:41:04.337737 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-logs\") pod \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\" (UID: \"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb\") " Mar 08 03:41:04.348012 master-0 kubenswrapper[13046]: I0308 03:41:04.346421 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-logs" (OuterVolumeSpecName: "logs") pod "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" (UID: "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:41:04.348012 master-0 kubenswrapper[13046]: I0308 03:41:04.346473 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-kube-api-access-9qthj" (OuterVolumeSpecName: "kube-api-access-9qthj") pod "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" (UID: "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb"). InnerVolumeSpecName "kube-api-access-9qthj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:41:04.377560 master-0 kubenswrapper[13046]: I0308 03:41:04.377496 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-config-data" (OuterVolumeSpecName: "config-data") pod "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" (UID: "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:04.381030 master-0 kubenswrapper[13046]: I0308 03:41:04.379622 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" (UID: "85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:04.449929 master-0 kubenswrapper[13046]: I0308 03:41:04.449875 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:04.449929 master-0 kubenswrapper[13046]: I0308 03:41:04.449921 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:04.449929 master-0 kubenswrapper[13046]: I0308 03:41:04.449933 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qthj\" (UniqueName: \"kubernetes.io/projected/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-kube-api-access-9qthj\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:04.449929 master-0 kubenswrapper[13046]: I0308 03:41:04.449943 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:04.462296 master-0 kubenswrapper[13046]: I0308 03:41:04.462249 13046 generic.go:334] "Generic (PLEG): container finished" podID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerID="4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e" exitCode=0 Mar 08 03:41:04.462519 master-0 kubenswrapper[13046]: I0308 03:41:04.462313 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:41:04.462519 master-0 kubenswrapper[13046]: I0308 03:41:04.462357 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb","Type":"ContainerDied","Data":"4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e"} Mar 08 03:41:04.462519 master-0 kubenswrapper[13046]: I0308 03:41:04.462435 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb","Type":"ContainerDied","Data":"399b3b91a858d06d66fd571a074d0b0f626a3418cf15be1f00e7aeaab50c15df"} Mar 08 03:41:04.462519 master-0 kubenswrapper[13046]: I0308 03:41:04.462460 13046 scope.go:117] "RemoveContainer" containerID="4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e" Mar 08 03:41:04.521104 master-0 kubenswrapper[13046]: I0308 03:41:04.520714 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:41:04.542946 master-0 kubenswrapper[13046]: I0308 03:41:04.541530 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:41:04.553989 master-0 kubenswrapper[13046]: I0308 03:41:04.548064 13046 scope.go:117] "RemoveContainer" containerID="60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f" Mar 08 03:41:04.565669 master-0 kubenswrapper[13046]: I0308 03:41:04.559138 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 03:41:04.565669 master-0 kubenswrapper[13046]: E0308 03:41:04.559658 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-api" Mar 08 03:41:04.565669 master-0 kubenswrapper[13046]: I0308 03:41:04.559674 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-api" Mar 08 03:41:04.565669 master-0 
kubenswrapper[13046]: E0308 03:41:04.559713 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-log" Mar 08 03:41:04.565669 master-0 kubenswrapper[13046]: I0308 03:41:04.559719 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-log" Mar 08 03:41:04.565669 master-0 kubenswrapper[13046]: I0308 03:41:04.559976 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-log" Mar 08 03:41:04.565669 master-0 kubenswrapper[13046]: I0308 03:41:04.560004 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" containerName="nova-api-api" Mar 08 03:41:04.565669 master-0 kubenswrapper[13046]: I0308 03:41:04.561762 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:41:04.569662 master-0 kubenswrapper[13046]: I0308 03:41:04.569620 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 03:41:04.569662 master-0 kubenswrapper[13046]: I0308 03:41:04.569635 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 03:41:04.569999 master-0 kubenswrapper[13046]: I0308 03:41:04.569974 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 03:41:04.582397 master-0 kubenswrapper[13046]: I0308 03:41:04.582161 13046 scope.go:117] "RemoveContainer" containerID="4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e" Mar 08 03:41:04.584664 master-0 kubenswrapper[13046]: E0308 03:41:04.584003 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e\": 
container with ID starting with 4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e not found: ID does not exist" containerID="4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e" Mar 08 03:41:04.584664 master-0 kubenswrapper[13046]: I0308 03:41:04.584042 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e"} err="failed to get container status \"4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e\": rpc error: code = NotFound desc = could not find container \"4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e\": container with ID starting with 4c02cc1f4083d0d9ddadb173551e3edb092d9c949a4759b87a3bd4eea14c748e not found: ID does not exist" Mar 08 03:41:04.584664 master-0 kubenswrapper[13046]: I0308 03:41:04.584066 13046 scope.go:117] "RemoveContainer" containerID="60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f" Mar 08 03:41:04.587040 master-0 kubenswrapper[13046]: E0308 03:41:04.586955 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f\": container with ID starting with 60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f not found: ID does not exist" containerID="60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f" Mar 08 03:41:04.587040 master-0 kubenswrapper[13046]: I0308 03:41:04.587011 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f"} err="failed to get container status \"60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f\": rpc error: code = NotFound desc = could not find container \"60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f\": container with ID starting with 
60c75b828636e183755c258ef4deb4d840eba281829c0a27bf5de91d32ba221f not found: ID does not exist" Mar 08 03:41:04.623592 master-0 kubenswrapper[13046]: I0308 03:41:04.622595 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:41:04.664516 master-0 kubenswrapper[13046]: I0308 03:41:04.656751 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-config-data\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.664516 master-0 kubenswrapper[13046]: I0308 03:41:04.656811 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knmdd\" (UniqueName: \"kubernetes.io/projected/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-kube-api-access-knmdd\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.664516 master-0 kubenswrapper[13046]: I0308 03:41:04.656889 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-logs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.664516 master-0 kubenswrapper[13046]: I0308 03:41:04.656953 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.664516 master-0 kubenswrapper[13046]: I0308 03:41:04.656975 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.664516 master-0 kubenswrapper[13046]: I0308 03:41:04.657008 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-public-tls-certs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.675084 master-0 kubenswrapper[13046]: W0308 03:41:04.675015 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73db713_dc45_42da_a26c_a6cf8d83821a.slice/crio-744ccac6b521baaddabc71c2a1d33332d5f0bc78e1e335e6c10ef53d6bf2f0b8 WatchSource:0}: Error finding container 744ccac6b521baaddabc71c2a1d33332d5f0bc78e1e335e6c10ef53d6bf2f0b8: Status 404 returned error can't find the container with id 744ccac6b521baaddabc71c2a1d33332d5f0bc78e1e335e6c10ef53d6bf2f0b8 Mar 08 03:41:04.703657 master-0 kubenswrapper[13046]: I0308 03:41:04.703347 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8n5sd"] Mar 08 03:41:04.759743 master-0 kubenswrapper[13046]: I0308 03:41:04.759286 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-config-data\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.759743 master-0 kubenswrapper[13046]: I0308 03:41:04.759610 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knmdd\" (UniqueName: \"kubernetes.io/projected/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-kube-api-access-knmdd\") pod \"nova-api-0\" (UID: 
\"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.761092 master-0 kubenswrapper[13046]: I0308 03:41:04.759879 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-logs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.761092 master-0 kubenswrapper[13046]: I0308 03:41:04.759960 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.761092 master-0 kubenswrapper[13046]: I0308 03:41:04.760117 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.761092 master-0 kubenswrapper[13046]: I0308 03:41:04.760222 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-public-tls-certs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.761092 master-0 kubenswrapper[13046]: I0308 03:41:04.760321 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-logs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.763658 master-0 kubenswrapper[13046]: I0308 03:41:04.763629 13046 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-public-tls-certs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.764961 master-0 kubenswrapper[13046]: I0308 03:41:04.764727 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-config-data\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.764961 master-0 kubenswrapper[13046]: I0308 03:41:04.764725 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.775566 master-0 kubenswrapper[13046]: I0308 03:41:04.773945 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-internal-tls-certs\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.777437 master-0 kubenswrapper[13046]: I0308 03:41:04.777397 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knmdd\" (UniqueName: \"kubernetes.io/projected/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-kube-api-access-knmdd\") pod \"nova-api-0\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " pod="openstack/nova-api-0" Mar 08 03:41:04.826079 master-0 kubenswrapper[13046]: I0308 03:41:04.826039 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-jgj9r"] Mar 08 03:41:04.883017 master-0 kubenswrapper[13046]: I0308 03:41:04.882896 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:41:05.396230 master-0 kubenswrapper[13046]: W0308 03:41:05.396146 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25a8dba9_1bc5_4002_9dea_7cc4eff36e16.slice/crio-766d5402938fb6b33768cd5d986284408257d12587afa8b9871d8b48a3ec8210 WatchSource:0}: Error finding container 766d5402938fb6b33768cd5d986284408257d12587afa8b9871d8b48a3ec8210: Status 404 returned error can't find the container with id 766d5402938fb6b33768cd5d986284408257d12587afa8b9871d8b48a3ec8210 Mar 08 03:41:05.421902 master-0 kubenswrapper[13046]: I0308 03:41:05.420942 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:41:05.483531 master-0 kubenswrapper[13046]: I0308 03:41:05.483434 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"25a8dba9-1bc5-4002-9dea-7cc4eff36e16","Type":"ContainerStarted","Data":"766d5402938fb6b33768cd5d986284408257d12587afa8b9871d8b48a3ec8210"} Mar 08 03:41:05.485202 master-0 kubenswrapper[13046]: I0308 03:41:05.485124 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8n5sd" event={"ID":"c73db713-dc45-42da-a26c-a6cf8d83821a","Type":"ContainerStarted","Data":"aec03aeaa079dff2f7d1817269e1f7bd4840803da8e785309968031d87af9708"} Mar 08 03:41:05.485315 master-0 kubenswrapper[13046]: I0308 03:41:05.485206 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8n5sd" event={"ID":"c73db713-dc45-42da-a26c-a6cf8d83821a","Type":"ContainerStarted","Data":"744ccac6b521baaddabc71c2a1d33332d5f0bc78e1e335e6c10ef53d6bf2f0b8"} Mar 08 03:41:05.486432 master-0 kubenswrapper[13046]: I0308 03:41:05.486388 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-jgj9r" 
event={"ID":"65bdf11b-0b56-43f0-8eb7-e15efda5ae67","Type":"ContainerStarted","Data":"297f43b318034f13eb7bc5d70e2e65b057d40949d2c0f3a73b72abdc005a922b"} Mar 08 03:41:05.486432 master-0 kubenswrapper[13046]: I0308 03:41:05.486424 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-jgj9r" event={"ID":"65bdf11b-0b56-43f0-8eb7-e15efda5ae67","Type":"ContainerStarted","Data":"a953b221132e824fca3c5a13c404ff83ffc7989b9029ec30b1ff5283fb946d43"} Mar 08 03:41:05.537650 master-0 kubenswrapper[13046]: I0308 03:41:05.537539 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-jgj9r" podStartSLOduration=2.537517941 podStartE2EDuration="2.537517941s" podCreationTimestamp="2026-03-08 03:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:41:05.535978527 +0000 UTC m=+1667.614745754" watchObservedRunningTime="2026-03-08 03:41:05.537517941 +0000 UTC m=+1667.616285168" Mar 08 03:41:05.539381 master-0 kubenswrapper[13046]: I0308 03:41:05.538454 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8n5sd" podStartSLOduration=2.538449527 podStartE2EDuration="2.538449527s" podCreationTimestamp="2026-03-08 03:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:41:05.512988945 +0000 UTC m=+1667.591756172" watchObservedRunningTime="2026-03-08 03:41:05.538449527 +0000 UTC m=+1667.617216744" Mar 08 03:41:06.130617 master-0 kubenswrapper[13046]: I0308 03:41:06.130559 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb" path="/var/lib/kubelet/pods/85c353b3-9fe6-4d8f-baa4-6ba61f21b2bb/volumes" Mar 08 03:41:06.506828 master-0 kubenswrapper[13046]: I0308 03:41:06.506638 13046 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"25a8dba9-1bc5-4002-9dea-7cc4eff36e16","Type":"ContainerStarted","Data":"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5"} Mar 08 03:41:06.506828 master-0 kubenswrapper[13046]: I0308 03:41:06.506717 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"25a8dba9-1bc5-4002-9dea-7cc4eff36e16","Type":"ContainerStarted","Data":"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"} Mar 08 03:41:06.563123 master-0 kubenswrapper[13046]: I0308 03:41:06.562918 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.562837496 podStartE2EDuration="2.562837496s" podCreationTimestamp="2026-03-08 03:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:41:06.536278374 +0000 UTC m=+1668.615045591" watchObservedRunningTime="2026-03-08 03:41:06.562837496 +0000 UTC m=+1668.641604763" Mar 08 03:41:08.107000 master-0 kubenswrapper[13046]: I0308 03:41:08.106927 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74d79d4489-nq9k9" Mar 08 03:41:08.227943 master-0 kubenswrapper[13046]: I0308 03:41:08.227502 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57cbb5d5bf-qr52m"] Mar 08 03:41:08.227943 master-0 kubenswrapper[13046]: I0308 03:41:08.227754 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" podUID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerName="dnsmasq-dns" containerID="cri-o://09646a1f2a1f1f151e481f09da6d16e51e3e13038d73293586d40132f11078b6" gracePeriod=10 Mar 08 03:41:08.528862 master-0 kubenswrapper[13046]: I0308 03:41:08.528636 13046 generic.go:334] "Generic (PLEG): container finished" 
podID="65bdf11b-0b56-43f0-8eb7-e15efda5ae67" containerID="297f43b318034f13eb7bc5d70e2e65b057d40949d2c0f3a73b72abdc005a922b" exitCode=0 Mar 08 03:41:08.528862 master-0 kubenswrapper[13046]: I0308 03:41:08.528710 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-jgj9r" event={"ID":"65bdf11b-0b56-43f0-8eb7-e15efda5ae67","Type":"ContainerDied","Data":"297f43b318034f13eb7bc5d70e2e65b057d40949d2c0f3a73b72abdc005a922b"} Mar 08 03:41:08.551163 master-0 kubenswrapper[13046]: I0308 03:41:08.551099 13046 generic.go:334] "Generic (PLEG): container finished" podID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerID="09646a1f2a1f1f151e481f09da6d16e51e3e13038d73293586d40132f11078b6" exitCode=0 Mar 08 03:41:08.551346 master-0 kubenswrapper[13046]: I0308 03:41:08.551156 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" event={"ID":"15ec3f9b-6617-4580-8758-0bfd755ff867","Type":"ContainerDied","Data":"09646a1f2a1f1f151e481f09da6d16e51e3e13038d73293586d40132f11078b6"} Mar 08 03:41:08.814657 master-0 kubenswrapper[13046]: I0308 03:41:08.814511 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:41:08.986727 master-0 kubenswrapper[13046]: I0308 03:41:08.986676 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-nb\") pod \"15ec3f9b-6617-4580-8758-0bfd755ff867\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " Mar 08 03:41:08.987041 master-0 kubenswrapper[13046]: I0308 03:41:08.986790 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-config\") pod \"15ec3f9b-6617-4580-8758-0bfd755ff867\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " Mar 08 03:41:08.987041 master-0 kubenswrapper[13046]: I0308 03:41:08.986821 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4pqg\" (UniqueName: \"kubernetes.io/projected/15ec3f9b-6617-4580-8758-0bfd755ff867-kube-api-access-f4pqg\") pod \"15ec3f9b-6617-4580-8758-0bfd755ff867\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " Mar 08 03:41:08.987041 master-0 kubenswrapper[13046]: I0308 03:41:08.986859 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-swift-storage-0\") pod \"15ec3f9b-6617-4580-8758-0bfd755ff867\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " Mar 08 03:41:08.987368 master-0 kubenswrapper[13046]: I0308 03:41:08.986917 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-svc\") pod \"15ec3f9b-6617-4580-8758-0bfd755ff867\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " Mar 08 03:41:08.987437 master-0 kubenswrapper[13046]: I0308 03:41:08.987418 13046 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-sb\") pod \"15ec3f9b-6617-4580-8758-0bfd755ff867\" (UID: \"15ec3f9b-6617-4580-8758-0bfd755ff867\") " Mar 08 03:41:08.991201 master-0 kubenswrapper[13046]: I0308 03:41:08.991144 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ec3f9b-6617-4580-8758-0bfd755ff867-kube-api-access-f4pqg" (OuterVolumeSpecName: "kube-api-access-f4pqg") pod "15ec3f9b-6617-4580-8758-0bfd755ff867" (UID: "15ec3f9b-6617-4580-8758-0bfd755ff867"). InnerVolumeSpecName "kube-api-access-f4pqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:41:09.063606 master-0 kubenswrapper[13046]: I0308 03:41:09.062689 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-config" (OuterVolumeSpecName: "config") pod "15ec3f9b-6617-4580-8758-0bfd755ff867" (UID: "15ec3f9b-6617-4580-8758-0bfd755ff867"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:41:09.067569 master-0 kubenswrapper[13046]: I0308 03:41:09.067535 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "15ec3f9b-6617-4580-8758-0bfd755ff867" (UID: "15ec3f9b-6617-4580-8758-0bfd755ff867"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:41:09.072160 master-0 kubenswrapper[13046]: I0308 03:41:09.072125 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15ec3f9b-6617-4580-8758-0bfd755ff867" (UID: "15ec3f9b-6617-4580-8758-0bfd755ff867"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:41:09.073853 master-0 kubenswrapper[13046]: I0308 03:41:09.073171 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "15ec3f9b-6617-4580-8758-0bfd755ff867" (UID: "15ec3f9b-6617-4580-8758-0bfd755ff867"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:41:09.088083 master-0 kubenswrapper[13046]: I0308 03:41:09.088014 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "15ec3f9b-6617-4580-8758-0bfd755ff867" (UID: "15ec3f9b-6617-4580-8758-0bfd755ff867"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:41:09.090649 master-0 kubenswrapper[13046]: I0308 03:41:09.090603 13046 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-config\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:09.090649 master-0 kubenswrapper[13046]: I0308 03:41:09.090646 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4pqg\" (UniqueName: \"kubernetes.io/projected/15ec3f9b-6617-4580-8758-0bfd755ff867-kube-api-access-f4pqg\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:09.090753 master-0 kubenswrapper[13046]: I0308 03:41:09.090659 13046 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:09.090753 master-0 kubenswrapper[13046]: I0308 03:41:09.090672 13046 reconciler_common.go:293] "Volume 
detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:09.090753 master-0 kubenswrapper[13046]: I0308 03:41:09.090684 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:09.090753 master-0 kubenswrapper[13046]: I0308 03:41:09.090695 13046 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15ec3f9b-6617-4580-8758-0bfd755ff867-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:09.571581 master-0 kubenswrapper[13046]: I0308 03:41:09.571512 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" event={"ID":"15ec3f9b-6617-4580-8758-0bfd755ff867","Type":"ContainerDied","Data":"90042b7fffafbb65106289721a85333ba5f7cea3fb2ef8fddf82f026e87dd2f2"} Mar 08 03:41:09.571581 master-0 kubenswrapper[13046]: I0308 03:41:09.571565 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57cbb5d5bf-qr52m" Mar 08 03:41:09.572168 master-0 kubenswrapper[13046]: I0308 03:41:09.571608 13046 scope.go:117] "RemoveContainer" containerID="09646a1f2a1f1f151e481f09da6d16e51e3e13038d73293586d40132f11078b6" Mar 08 03:41:09.650016 master-0 kubenswrapper[13046]: I0308 03:41:09.649973 13046 scope.go:117] "RemoveContainer" containerID="b89608461ea12d4cd9e07bebc2bf6d9eb64ab32e8bc3527426d8a8cddbdda400" Mar 08 03:41:09.671621 master-0 kubenswrapper[13046]: I0308 03:41:09.671568 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57cbb5d5bf-qr52m"] Mar 08 03:41:09.716900 master-0 kubenswrapper[13046]: I0308 03:41:09.716845 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57cbb5d5bf-qr52m"] Mar 08 03:41:10.037676 master-0 kubenswrapper[13046]: I0308 03:41:10.037629 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:10.118275 master-0 kubenswrapper[13046]: I0308 03:41:10.117592 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-combined-ca-bundle\") pod \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " Mar 08 03:41:10.118275 master-0 kubenswrapper[13046]: I0308 03:41:10.117734 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-config-data\") pod \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " Mar 08 03:41:10.118275 master-0 kubenswrapper[13046]: I0308 03:41:10.117881 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-scripts\") pod \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " Mar 08 03:41:10.118275 master-0 kubenswrapper[13046]: I0308 03:41:10.118055 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55p2g\" (UniqueName: \"kubernetes.io/projected/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-kube-api-access-55p2g\") pod \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\" (UID: \"65bdf11b-0b56-43f0-8eb7-e15efda5ae67\") " Mar 08 03:41:10.124443 master-0 kubenswrapper[13046]: I0308 03:41:10.124393 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-scripts" (OuterVolumeSpecName: "scripts") pod "65bdf11b-0b56-43f0-8eb7-e15efda5ae67" (UID: "65bdf11b-0b56-43f0-8eb7-e15efda5ae67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:10.129516 master-0 kubenswrapper[13046]: I0308 03:41:10.126771 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-kube-api-access-55p2g" (OuterVolumeSpecName: "kube-api-access-55p2g") pod "65bdf11b-0b56-43f0-8eb7-e15efda5ae67" (UID: "65bdf11b-0b56-43f0-8eb7-e15efda5ae67"). InnerVolumeSpecName "kube-api-access-55p2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:41:10.173648 master-0 kubenswrapper[13046]: I0308 03:41:10.173575 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ec3f9b-6617-4580-8758-0bfd755ff867" path="/var/lib/kubelet/pods/15ec3f9b-6617-4580-8758-0bfd755ff867/volumes" Mar 08 03:41:10.185170 master-0 kubenswrapper[13046]: I0308 03:41:10.185111 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65bdf11b-0b56-43f0-8eb7-e15efda5ae67" (UID: "65bdf11b-0b56-43f0-8eb7-e15efda5ae67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:10.187188 master-0 kubenswrapper[13046]: I0308 03:41:10.187079 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-config-data" (OuterVolumeSpecName: "config-data") pod "65bdf11b-0b56-43f0-8eb7-e15efda5ae67" (UID: "65bdf11b-0b56-43f0-8eb7-e15efda5ae67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:10.221621 master-0 kubenswrapper[13046]: I0308 03:41:10.221579 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:10.221621 master-0 kubenswrapper[13046]: I0308 03:41:10.221618 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:10.221757 master-0 kubenswrapper[13046]: I0308 03:41:10.221628 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:10.221757 master-0 kubenswrapper[13046]: I0308 03:41:10.221640 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55p2g\" (UniqueName: \"kubernetes.io/projected/65bdf11b-0b56-43f0-8eb7-e15efda5ae67-kube-api-access-55p2g\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:10.592731 master-0 kubenswrapper[13046]: I0308 03:41:10.592624 13046 generic.go:334] "Generic (PLEG): container finished" podID="c73db713-dc45-42da-a26c-a6cf8d83821a" containerID="aec03aeaa079dff2f7d1817269e1f7bd4840803da8e785309968031d87af9708" exitCode=0 Mar 08 03:41:10.592731 master-0 kubenswrapper[13046]: I0308 03:41:10.592691 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8n5sd" event={"ID":"c73db713-dc45-42da-a26c-a6cf8d83821a","Type":"ContainerDied","Data":"aec03aeaa079dff2f7d1817269e1f7bd4840803da8e785309968031d87af9708"} Mar 08 03:41:10.595904 master-0 kubenswrapper[13046]: I0308 03:41:10.595840 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-jgj9r" 
event={"ID":"65bdf11b-0b56-43f0-8eb7-e15efda5ae67","Type":"ContainerDied","Data":"a953b221132e824fca3c5a13c404ff83ffc7989b9029ec30b1ff5283fb946d43"} Mar 08 03:41:10.596072 master-0 kubenswrapper[13046]: I0308 03:41:10.595911 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a953b221132e824fca3c5a13c404ff83ffc7989b9029ec30b1ff5283fb946d43" Mar 08 03:41:10.596178 master-0 kubenswrapper[13046]: I0308 03:41:10.596054 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-jgj9r" Mar 08 03:41:12.145450 master-0 kubenswrapper[13046]: I0308 03:41:12.145390 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:12.278515 master-0 kubenswrapper[13046]: I0308 03:41:12.278412 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-combined-ca-bundle\") pod \"c73db713-dc45-42da-a26c-a6cf8d83821a\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " Mar 08 03:41:12.278894 master-0 kubenswrapper[13046]: I0308 03:41:12.278857 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-config-data\") pod \"c73db713-dc45-42da-a26c-a6cf8d83821a\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " Mar 08 03:41:12.279265 master-0 kubenswrapper[13046]: I0308 03:41:12.279225 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-scripts\") pod \"c73db713-dc45-42da-a26c-a6cf8d83821a\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " Mar 08 03:41:12.279438 master-0 kubenswrapper[13046]: I0308 03:41:12.279404 13046 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zfkqr\" (UniqueName: \"kubernetes.io/projected/c73db713-dc45-42da-a26c-a6cf8d83821a-kube-api-access-zfkqr\") pod \"c73db713-dc45-42da-a26c-a6cf8d83821a\" (UID: \"c73db713-dc45-42da-a26c-a6cf8d83821a\") " Mar 08 03:41:12.282673 master-0 kubenswrapper[13046]: I0308 03:41:12.282536 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73db713-dc45-42da-a26c-a6cf8d83821a-kube-api-access-zfkqr" (OuterVolumeSpecName: "kube-api-access-zfkqr") pod "c73db713-dc45-42da-a26c-a6cf8d83821a" (UID: "c73db713-dc45-42da-a26c-a6cf8d83821a"). InnerVolumeSpecName "kube-api-access-zfkqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:41:12.283551 master-0 kubenswrapper[13046]: I0308 03:41:12.283346 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-scripts" (OuterVolumeSpecName: "scripts") pod "c73db713-dc45-42da-a26c-a6cf8d83821a" (UID: "c73db713-dc45-42da-a26c-a6cf8d83821a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:12.305854 master-0 kubenswrapper[13046]: I0308 03:41:12.305801 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-config-data" (OuterVolumeSpecName: "config-data") pod "c73db713-dc45-42da-a26c-a6cf8d83821a" (UID: "c73db713-dc45-42da-a26c-a6cf8d83821a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:12.326621 master-0 kubenswrapper[13046]: I0308 03:41:12.326535 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c73db713-dc45-42da-a26c-a6cf8d83821a" (UID: "c73db713-dc45-42da-a26c-a6cf8d83821a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:12.385468 master-0 kubenswrapper[13046]: I0308 03:41:12.385392 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfkqr\" (UniqueName: \"kubernetes.io/projected/c73db713-dc45-42da-a26c-a6cf8d83821a-kube-api-access-zfkqr\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:12.385468 master-0 kubenswrapper[13046]: I0308 03:41:12.385463 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:12.385680 master-0 kubenswrapper[13046]: I0308 03:41:12.385512 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:12.385680 master-0 kubenswrapper[13046]: I0308 03:41:12.385532 13046 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73db713-dc45-42da-a26c-a6cf8d83821a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:12.628951 master-0 kubenswrapper[13046]: I0308 03:41:12.628885 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8n5sd" event={"ID":"c73db713-dc45-42da-a26c-a6cf8d83821a","Type":"ContainerDied","Data":"744ccac6b521baaddabc71c2a1d33332d5f0bc78e1e335e6c10ef53d6bf2f0b8"} Mar 08 03:41:12.628951 master-0 kubenswrapper[13046]: I0308 03:41:12.628939 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="744ccac6b521baaddabc71c2a1d33332d5f0bc78e1e335e6c10ef53d6bf2f0b8" Mar 08 03:41:12.629227 master-0 kubenswrapper[13046]: I0308 03:41:12.628997 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8n5sd" Mar 08 03:41:12.815800 master-0 kubenswrapper[13046]: I0308 03:41:12.815715 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 03:41:12.816062 master-0 kubenswrapper[13046]: I0308 03:41:12.815991 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-log" containerID="cri-o://341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f" gracePeriod=30 Mar 08 03:41:12.816355 master-0 kubenswrapper[13046]: I0308 03:41:12.816186 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-api" containerID="cri-o://dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5" gracePeriod=30 Mar 08 03:41:12.882584 master-0 kubenswrapper[13046]: I0308 03:41:12.882038 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:41:12.882584 master-0 kubenswrapper[13046]: I0308 03:41:12.882297 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b624d075-24f1-4e0a-8638-d4a694ab697f" containerName="nova-scheduler-scheduler" containerID="cri-o://4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" gracePeriod=30 Mar 08 03:41:12.934914 master-0 kubenswrapper[13046]: I0308 03:41:12.934855 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:41:12.935143 master-0 kubenswrapper[13046]: I0308 03:41:12.935087 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-log" containerID="cri-o://f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b" 
gracePeriod=30 Mar 08 03:41:12.935573 master-0 kubenswrapper[13046]: I0308 03:41:12.935544 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-metadata" containerID="cri-o://3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a" gracePeriod=30 Mar 08 03:41:13.493622 master-0 kubenswrapper[13046]: I0308 03:41:13.493589 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:41:13.620760 master-0 kubenswrapper[13046]: I0308 03:41:13.620617 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-logs\") pod \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " Mar 08 03:41:13.621177 master-0 kubenswrapper[13046]: I0308 03:41:13.621017 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-logs" (OuterVolumeSpecName: "logs") pod "25a8dba9-1bc5-4002-9dea-7cc4eff36e16" (UID: "25a8dba9-1bc5-4002-9dea-7cc4eff36e16"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 03:41:13.621575 master-0 kubenswrapper[13046]: I0308 03:41:13.621545 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-config-data\") pod \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " Mar 08 03:41:13.621669 master-0 kubenswrapper[13046]: I0308 03:41:13.621595 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knmdd\" (UniqueName: \"kubernetes.io/projected/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-kube-api-access-knmdd\") pod \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " Mar 08 03:41:13.621756 master-0 kubenswrapper[13046]: I0308 03:41:13.621725 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-internal-tls-certs\") pod \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " Mar 08 03:41:13.621827 master-0 kubenswrapper[13046]: I0308 03:41:13.621818 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-public-tls-certs\") pod \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " Mar 08 03:41:13.622019 master-0 kubenswrapper[13046]: I0308 03:41:13.621981 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-combined-ca-bundle\") pod \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\" (UID: \"25a8dba9-1bc5-4002-9dea-7cc4eff36e16\") " Mar 08 03:41:13.623580 master-0 kubenswrapper[13046]: I0308 03:41:13.623549 13046 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-logs\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:13.624769 master-0 kubenswrapper[13046]: I0308 03:41:13.624721 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-kube-api-access-knmdd" (OuterVolumeSpecName: "kube-api-access-knmdd") pod "25a8dba9-1bc5-4002-9dea-7cc4eff36e16" (UID: "25a8dba9-1bc5-4002-9dea-7cc4eff36e16"). InnerVolumeSpecName "kube-api-access-knmdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:41:13.654535 master-0 kubenswrapper[13046]: I0308 03:41:13.654468 13046 generic.go:334] "Generic (PLEG): container finished" podID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerID="f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b" exitCode=143 Mar 08 03:41:13.654751 master-0 kubenswrapper[13046]: I0308 03:41:13.654544 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"761f3f7f-0243-43bb-a6c7-a702fc601758","Type":"ContainerDied","Data":"f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b"} Mar 08 03:41:13.657871 master-0 kubenswrapper[13046]: I0308 03:41:13.657828 13046 generic.go:334] "Generic (PLEG): container finished" podID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerID="dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5" exitCode=0 Mar 08 03:41:13.657871 master-0 kubenswrapper[13046]: I0308 03:41:13.657847 13046 generic.go:334] "Generic (PLEG): container finished" podID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerID="341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f" exitCode=143 Mar 08 03:41:13.657871 master-0 kubenswrapper[13046]: I0308 03:41:13.657870 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"25a8dba9-1bc5-4002-9dea-7cc4eff36e16","Type":"ContainerDied","Data":"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5"} Mar 08 03:41:13.658067 master-0 kubenswrapper[13046]: I0308 03:41:13.657902 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"25a8dba9-1bc5-4002-9dea-7cc4eff36e16","Type":"ContainerDied","Data":"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"} Mar 08 03:41:13.658067 master-0 kubenswrapper[13046]: I0308 03:41:13.657914 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"25a8dba9-1bc5-4002-9dea-7cc4eff36e16","Type":"ContainerDied","Data":"766d5402938fb6b33768cd5d986284408257d12587afa8b9871d8b48a3ec8210"} Mar 08 03:41:13.658067 master-0 kubenswrapper[13046]: I0308 03:41:13.657923 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 03:41:13.658188 master-0 kubenswrapper[13046]: I0308 03:41:13.657929 13046 scope.go:117] "RemoveContainer" containerID="dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5" Mar 08 03:41:13.686185 master-0 kubenswrapper[13046]: I0308 03:41:13.684920 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-config-data" (OuterVolumeSpecName: "config-data") pod "25a8dba9-1bc5-4002-9dea-7cc4eff36e16" (UID: "25a8dba9-1bc5-4002-9dea-7cc4eff36e16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:13.686185 master-0 kubenswrapper[13046]: I0308 03:41:13.685322 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25a8dba9-1bc5-4002-9dea-7cc4eff36e16" (UID: "25a8dba9-1bc5-4002-9dea-7cc4eff36e16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:41:13.712141 master-0 kubenswrapper[13046]: I0308 03:41:13.712025 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "25a8dba9-1bc5-4002-9dea-7cc4eff36e16" (UID: "25a8dba9-1bc5-4002-9dea-7cc4eff36e16"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:41:13.721207 master-0 kubenswrapper[13046]: I0308 03:41:13.721153 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "25a8dba9-1bc5-4002-9dea-7cc4eff36e16" (UID: "25a8dba9-1bc5-4002-9dea-7cc4eff36e16"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:41:13.726935 master-0 kubenswrapper[13046]: I0308 03:41:13.726870 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knmdd\" (UniqueName: \"kubernetes.io/projected/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-kube-api-access-knmdd\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:13.726935 master-0 kubenswrapper[13046]: I0308 03:41:13.726937 13046 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:13.727207 master-0 kubenswrapper[13046]: I0308 03:41:13.726957 13046 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:13.727207 master-0 kubenswrapper[13046]: I0308 03:41:13.726976 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:13.727207 master-0 kubenswrapper[13046]: I0308 03:41:13.726997 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a8dba9-1bc5-4002-9dea-7cc4eff36e16-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:13.751385 master-0 kubenswrapper[13046]: I0308 03:41:13.751074 13046 scope.go:117] "RemoveContainer" containerID="341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"
Mar 08 03:41:13.779761 master-0 kubenswrapper[13046]: I0308 03:41:13.779710 13046 scope.go:117] "RemoveContainer" containerID="dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5"
Mar 08 03:41:13.780145 master-0 kubenswrapper[13046]: E0308 03:41:13.780113 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5\": container with ID starting with dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5 not found: ID does not exist" containerID="dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5"
Mar 08 03:41:13.780227 master-0 kubenswrapper[13046]: I0308 03:41:13.780143 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5"} err="failed to get container status \"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5\": rpc error: code = NotFound desc = could not find container \"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5\": container with ID starting with dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5 not found: ID does not exist"
Mar 08 03:41:13.780227 master-0 kubenswrapper[13046]: I0308 03:41:13.780163 13046 scope.go:117] "RemoveContainer" containerID="341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"
Mar 08 03:41:13.780667 master-0 kubenswrapper[13046]: E0308 03:41:13.780590 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f\": container with ID starting with 341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f not found: ID does not exist" containerID="341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"
Mar 08 03:41:13.780667 master-0 kubenswrapper[13046]: I0308 03:41:13.780639 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"} err="failed to get container status \"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f\": rpc error: code = NotFound desc = could not find container \"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f\": container with ID starting with 341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f not found: ID does not exist"
Mar 08 03:41:13.780768 master-0 kubenswrapper[13046]: I0308 03:41:13.780671 13046 scope.go:117] "RemoveContainer" containerID="dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5"
Mar 08 03:41:13.781406 master-0 kubenswrapper[13046]: I0308 03:41:13.781023 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5"} err="failed to get container status \"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5\": rpc error: code = NotFound desc = could not find container \"dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5\": container with ID starting with dab336ecc93b70f2df39f339b71cd9c9f155de292b724a0aa51fead225facaf5 not found: ID does not exist"
Mar 08 03:41:13.781406 master-0 kubenswrapper[13046]: I0308 03:41:13.781087 13046 scope.go:117] "RemoveContainer" containerID="341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"
Mar 08 03:41:13.781714 master-0 kubenswrapper[13046]: I0308 03:41:13.781686 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f"} err="failed to get container status \"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f\": rpc error: code = NotFound desc = could not find container \"341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f\": container with ID starting with 341c1fca25a6472c9695a9ead974a2b5dba831a03eb3e4bf681541084117f64f not found: ID does not exist"
Mar 08 03:41:14.011516 master-0 kubenswrapper[13046]: I0308 03:41:14.008130 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 03:41:14.024681 master-0 kubenswrapper[13046]: I0308 03:41:14.024609 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 08 03:41:14.039560 master-0 kubenswrapper[13046]: I0308 03:41:14.038839 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: E0308 03:41:14.039572 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-log"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: I0308 03:41:14.039592 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-log"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: E0308 03:41:14.039607 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerName="init"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: I0308 03:41:14.039615 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerName="init"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: E0308 03:41:14.039643 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-api"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: I0308 03:41:14.039651 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-api"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: E0308 03:41:14.039660 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bdf11b-0b56-43f0-8eb7-e15efda5ae67" containerName="nova-manage"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: I0308 03:41:14.039667 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bdf11b-0b56-43f0-8eb7-e15efda5ae67" containerName="nova-manage"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: E0308 03:41:14.039689 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerName="dnsmasq-dns"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: I0308 03:41:14.039697 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerName="dnsmasq-dns"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: E0308 03:41:14.039721 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73db713-dc45-42da-a26c-a6cf8d83821a" containerName="nova-manage"
Mar 08 03:41:14.039756 master-0 kubenswrapper[13046]: I0308 03:41:14.039729 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73db713-dc45-42da-a26c-a6cf8d83821a" containerName="nova-manage"
Mar 08 03:41:14.040102 master-0 kubenswrapper[13046]: I0308 03:41:14.039970 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73db713-dc45-42da-a26c-a6cf8d83821a" containerName="nova-manage"
Mar 08 03:41:14.040102 master-0 kubenswrapper[13046]: I0308 03:41:14.040017 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-log"
Mar 08 03:41:14.040102 master-0 kubenswrapper[13046]: I0308 03:41:14.040030 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ec3f9b-6617-4580-8758-0bfd755ff867" containerName="dnsmasq-dns"
Mar 08 03:41:14.040102 master-0 kubenswrapper[13046]: I0308 03:41:14.040061 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" containerName="nova-api-api"
Mar 08 03:41:14.040102 master-0 kubenswrapper[13046]: I0308 03:41:14.040071 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bdf11b-0b56-43f0-8eb7-e15efda5ae67" containerName="nova-manage"
Mar 08 03:41:14.044644 master-0 kubenswrapper[13046]: I0308 03:41:14.041658 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 03:41:14.054512 master-0 kubenswrapper[13046]: I0308 03:41:14.051003 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 08 03:41:14.054512 master-0 kubenswrapper[13046]: I0308 03:41:14.051193 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 08 03:41:14.054512 master-0 kubenswrapper[13046]: I0308 03:41:14.051329 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 08 03:41:14.065118 master-0 kubenswrapper[13046]: I0308 03:41:14.065075 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 03:41:14.136863 master-0 kubenswrapper[13046]: I0308 03:41:14.136827 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a8dba9-1bc5-4002-9dea-7cc4eff36e16" path="/var/lib/kubelet/pods/25a8dba9-1bc5-4002-9dea-7cc4eff36e16/volumes"
Mar 08 03:41:14.139855 master-0 kubenswrapper[13046]: I0308 03:41:14.139831 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.140066 master-0 kubenswrapper[13046]: I0308 03:41:14.140049 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.140189 master-0 kubenswrapper[13046]: I0308 03:41:14.140173 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4lhk\" (UniqueName: \"kubernetes.io/projected/3d57373c-2f48-40da-baa7-611702e9ace5-kube-api-access-z4lhk\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.140498 master-0 kubenswrapper[13046]: I0308 03:41:14.140469 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.140623 master-0 kubenswrapper[13046]: I0308 03:41:14.140609 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-config-data\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.140803 master-0 kubenswrapper[13046]: I0308 03:41:14.140763 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d57373c-2f48-40da-baa7-611702e9ace5-logs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.243643 master-0 kubenswrapper[13046]: I0308 03:41:14.243429 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.243643 master-0 kubenswrapper[13046]: I0308 03:41:14.243543 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-config-data\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.243643 master-0 kubenswrapper[13046]: I0308 03:41:14.243614 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d57373c-2f48-40da-baa7-611702e9ace5-logs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.244140 master-0 kubenswrapper[13046]: I0308 03:41:14.243830 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.244140 master-0 kubenswrapper[13046]: I0308 03:41:14.243932 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.244140 master-0 kubenswrapper[13046]: I0308 03:41:14.243966 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4lhk\" (UniqueName: \"kubernetes.io/projected/3d57373c-2f48-40da-baa7-611702e9ace5-kube-api-access-z4lhk\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.244421 master-0 kubenswrapper[13046]: I0308 03:41:14.244258 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d57373c-2f48-40da-baa7-611702e9ace5-logs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.247182 master-0 kubenswrapper[13046]: I0308 03:41:14.247118 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-config-data\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.247550 master-0 kubenswrapper[13046]: I0308 03:41:14.247508 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.252366 master-0 kubenswrapper[13046]: I0308 03:41:14.252324 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.260711 master-0 kubenswrapper[13046]: I0308 03:41:14.260670 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4lhk\" (UniqueName: \"kubernetes.io/projected/3d57373c-2f48-40da-baa7-611702e9ace5-kube-api-access-z4lhk\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.316537 master-0 kubenswrapper[13046]: I0308 03:41:14.316460 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d57373c-2f48-40da-baa7-611702e9ace5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d57373c-2f48-40da-baa7-611702e9ace5\") " pod="openstack/nova-api-0"
Mar 08 03:41:14.358050 master-0 kubenswrapper[13046]: E0308 03:41:14.357937 13046 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 03:41:14.360138 master-0 kubenswrapper[13046]: E0308 03:41:14.360023 13046 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 03:41:14.362603 master-0 kubenswrapper[13046]: E0308 03:41:14.361936 13046 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 03:41:14.362603 master-0 kubenswrapper[13046]: E0308 03:41:14.361988 13046 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b624d075-24f1-4e0a-8638-d4a694ab697f" containerName="nova-scheduler-scheduler"
Mar 08 03:41:14.442808 master-0 kubenswrapper[13046]: I0308 03:41:14.442742 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 03:41:14.978581 master-0 kubenswrapper[13046]: I0308 03:41:14.978461 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 03:41:14.980334 master-0 kubenswrapper[13046]: W0308 03:41:14.978778 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d57373c_2f48_40da_baa7_611702e9ace5.slice/crio-9d241f7b1533dfc588457b78f56aa007d02715ce10aa6cf42540ea3a4ffb2eac WatchSource:0}: Error finding container 9d241f7b1533dfc588457b78f56aa007d02715ce10aa6cf42540ea3a4ffb2eac: Status 404 returned error can't find the container with id 9d241f7b1533dfc588457b78f56aa007d02715ce10aa6cf42540ea3a4ffb2eac
Mar 08 03:41:15.720422 master-0 kubenswrapper[13046]: I0308 03:41:15.720345 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d57373c-2f48-40da-baa7-611702e9ace5","Type":"ContainerStarted","Data":"bd3ee428c44cf5b0cc79d57f33d01a41818add08c30302dfb5be3c4933cba618"}
Mar 08 03:41:15.720422 master-0 kubenswrapper[13046]: I0308 03:41:15.720412 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d57373c-2f48-40da-baa7-611702e9ace5","Type":"ContainerStarted","Data":"14b85cd73a77763ea690a82a44bd9a12898b275d6b03dc601b6336bb07208abc"}
Mar 08 03:41:15.720422 master-0 kubenswrapper[13046]: I0308 03:41:15.720425 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d57373c-2f48-40da-baa7-611702e9ace5","Type":"ContainerStarted","Data":"9d241f7b1533dfc588457b78f56aa007d02715ce10aa6cf42540ea3a4ffb2eac"}
Mar 08 03:41:15.756040 master-0 kubenswrapper[13046]: I0308 03:41:15.752392 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.752372435 podStartE2EDuration="2.752372435s" podCreationTimestamp="2026-03-08 03:41:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:41:15.748551787 +0000 UTC m=+1677.827319014" watchObservedRunningTime="2026-03-08 03:41:15.752372435 +0000 UTC m=+1677.831139642"
Mar 08 03:41:16.076241 master-0 kubenswrapper[13046]: I0308 03:41:16.076080 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.10:8775/\": read tcp 10.128.0.2:49930->10.128.1.10:8775: read: connection reset by peer"
Mar 08 03:41:16.076819 master-0 kubenswrapper[13046]: I0308 03:41:16.076598 13046 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.10:8775/\": read tcp 10.128.0.2:49946->10.128.1.10:8775: read: connection reset by peer"
Mar 08 03:41:16.668391 master-0 kubenswrapper[13046]: I0308 03:41:16.668327 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:41:16.731926 master-0 kubenswrapper[13046]: I0308 03:41:16.722069 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761f3f7f-0243-43bb-a6c7-a702fc601758-logs\") pod \"761f3f7f-0243-43bb-a6c7-a702fc601758\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") "
Mar 08 03:41:16.731926 master-0 kubenswrapper[13046]: I0308 03:41:16.722261 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-combined-ca-bundle\") pod \"761f3f7f-0243-43bb-a6c7-a702fc601758\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") "
Mar 08 03:41:16.731926 master-0 kubenswrapper[13046]: I0308 03:41:16.722338 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9ftl\" (UniqueName: \"kubernetes.io/projected/761f3f7f-0243-43bb-a6c7-a702fc601758-kube-api-access-v9ftl\") pod \"761f3f7f-0243-43bb-a6c7-a702fc601758\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") "
Mar 08 03:41:16.731926 master-0 kubenswrapper[13046]: I0308 03:41:16.722758 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-config-data\") pod \"761f3f7f-0243-43bb-a6c7-a702fc601758\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") "
Mar 08 03:41:16.731926 master-0 kubenswrapper[13046]: I0308 03:41:16.723008 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-nova-metadata-tls-certs\") pod \"761f3f7f-0243-43bb-a6c7-a702fc601758\" (UID: \"761f3f7f-0243-43bb-a6c7-a702fc601758\") "
Mar 08 03:41:16.733426 master-0 kubenswrapper[13046]: I0308 03:41:16.733362 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761f3f7f-0243-43bb-a6c7-a702fc601758-logs" (OuterVolumeSpecName: "logs") pod "761f3f7f-0243-43bb-a6c7-a702fc601758" (UID: "761f3f7f-0243-43bb-a6c7-a702fc601758"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 03:41:16.750068 master-0 kubenswrapper[13046]: I0308 03:41:16.749999 13046 generic.go:334] "Generic (PLEG): container finished" podID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerID="3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a" exitCode=0
Mar 08 03:41:16.751185 master-0 kubenswrapper[13046]: I0308 03:41:16.751145 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:41:16.751802 master-0 kubenswrapper[13046]: I0308 03:41:16.751760 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"761f3f7f-0243-43bb-a6c7-a702fc601758","Type":"ContainerDied","Data":"3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a"}
Mar 08 03:41:16.751802 master-0 kubenswrapper[13046]: I0308 03:41:16.751796 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"761f3f7f-0243-43bb-a6c7-a702fc601758","Type":"ContainerDied","Data":"db6a158a7eb9b6bf4968601d869ef4756e0040cf75c5b6afc2f678ae9ba76dea"}
Mar 08 03:41:16.751975 master-0 kubenswrapper[13046]: I0308 03:41:16.751813 13046 scope.go:117] "RemoveContainer" containerID="3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a"
Mar 08 03:41:16.768059 master-0 kubenswrapper[13046]: I0308 03:41:16.767986 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761f3f7f-0243-43bb-a6c7-a702fc601758-kube-api-access-v9ftl" (OuterVolumeSpecName: "kube-api-access-v9ftl") pod "761f3f7f-0243-43bb-a6c7-a702fc601758" (UID: "761f3f7f-0243-43bb-a6c7-a702fc601758"). InnerVolumeSpecName "kube-api-access-v9ftl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:41:16.780732 master-0 kubenswrapper[13046]: I0308 03:41:16.780674 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-config-data" (OuterVolumeSpecName: "config-data") pod "761f3f7f-0243-43bb-a6c7-a702fc601758" (UID: "761f3f7f-0243-43bb-a6c7-a702fc601758"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:41:16.799571 master-0 kubenswrapper[13046]: I0308 03:41:16.799519 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "761f3f7f-0243-43bb-a6c7-a702fc601758" (UID: "761f3f7f-0243-43bb-a6c7-a702fc601758"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:41:16.805065 master-0 kubenswrapper[13046]: I0308 03:41:16.804997 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "761f3f7f-0243-43bb-a6c7-a702fc601758" (UID: "761f3f7f-0243-43bb-a6c7-a702fc601758"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 03:41:16.828478 master-0 kubenswrapper[13046]: I0308 03:41:16.826937 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-config-data\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:16.828478 master-0 kubenswrapper[13046]: I0308 03:41:16.826985 13046 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:16.828478 master-0 kubenswrapper[13046]: I0308 03:41:16.827008 13046 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761f3f7f-0243-43bb-a6c7-a702fc601758-logs\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:16.828478 master-0 kubenswrapper[13046]: I0308 03:41:16.827028 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761f3f7f-0243-43bb-a6c7-a702fc601758-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:16.828478 master-0 kubenswrapper[13046]: I0308 03:41:16.827048 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9ftl\" (UniqueName: \"kubernetes.io/projected/761f3f7f-0243-43bb-a6c7-a702fc601758-kube-api-access-v9ftl\") on node \"master-0\" DevicePath \"\""
Mar 08 03:41:16.870753 master-0 kubenswrapper[13046]: I0308 03:41:16.870700 13046 scope.go:117] "RemoveContainer" containerID="f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b"
Mar 08 03:41:16.896893 master-0 kubenswrapper[13046]: I0308 03:41:16.896823 13046 scope.go:117] "RemoveContainer" containerID="3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a"
Mar 08 03:41:16.897216 master-0 kubenswrapper[13046]: E0308 03:41:16.897171 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a\": container with ID starting with 3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a not found: ID does not exist" containerID="3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a"
Mar 08 03:41:16.897216 master-0 kubenswrapper[13046]: I0308 03:41:16.897206 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a"} err="failed to get container status \"3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a\": rpc error: code = NotFound desc = could not find container \"3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a\": container with ID starting with 3b87c3db1b764f9e7cab5d1d7b6a245f626c5323a2117d4264330b03e3c34d0a not found: ID does not exist"
Mar 08 03:41:16.897410 master-0 kubenswrapper[13046]: I0308 03:41:16.897226 13046 scope.go:117] "RemoveContainer" containerID="f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b"
Mar 08 03:41:16.897513 master-0 kubenswrapper[13046]: E0308 03:41:16.897429 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b\": container with ID starting with f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b not found: ID does not exist" containerID="f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b"
Mar 08 03:41:16.897513 master-0 kubenswrapper[13046]: I0308 03:41:16.897461 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b"} err="failed to get container status \"f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b\": rpc error: code = NotFound desc = could not find container \"f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b\": container with ID starting with f33442a5da95e35c6f8dc85aa98da530fa00c3f2215726eded668a3c20c5752b not found: ID does not exist"
Mar 08 03:41:17.102850 master-0 kubenswrapper[13046]: I0308 03:41:17.102777 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:41:17.129387 master-0 kubenswrapper[13046]: I0308 03:41:17.129305 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:41:17.143412 master-0 kubenswrapper[13046]: I0308 03:41:17.143359 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:41:17.144218 master-0 kubenswrapper[13046]: E0308 03:41:17.144189 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-metadata"
Mar 08 03:41:17.144272 master-0 kubenswrapper[13046]: I0308 03:41:17.144217 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-metadata"
Mar 08 03:41:17.144272 master-0 kubenswrapper[13046]: E0308 03:41:17.144259 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-log"
Mar 08 03:41:17.144272 master-0 kubenswrapper[13046]: I0308 03:41:17.144268 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-log"
Mar 08 03:41:17.144603 master-0 kubenswrapper[13046]: I0308 03:41:17.144579 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-metadata"
Mar 08 03:41:17.144664 master-0 kubenswrapper[13046]: I0308 03:41:17.144629 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" containerName="nova-metadata-log"
Mar 08 03:41:17.146160 master-0 kubenswrapper[13046]: I0308 03:41:17.146132 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 03:41:17.150328 master-0 kubenswrapper[13046]: I0308 03:41:17.148892 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 08 03:41:17.150328 master-0 kubenswrapper[13046]: I0308 03:41:17.149027 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 08 03:41:17.169447 master-0 kubenswrapper[13046]: I0308 03:41:17.169391 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 03:41:17.235609 master-0 kubenswrapper[13046]: I0308 03:41:17.235566 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.235811 master-0 kubenswrapper[13046]: I0308 03:41:17.235631 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-config-data\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.235985 master-0 kubenswrapper[13046]: I0308 03:41:17.235959 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52318bdd-862f-46ac-af96-9672cf810025-logs\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.236079 master-0 kubenswrapper[13046]: I0308 03:41:17.236047 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.236134 master-0 kubenswrapper[13046]: I0308 03:41:17.236092 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp79n\" (UniqueName: \"kubernetes.io/projected/52318bdd-862f-46ac-af96-9672cf810025-kube-api-access-rp79n\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.338398 master-0 kubenswrapper[13046]: I0308 03:41:17.338343 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.338511 master-0 kubenswrapper[13046]: I0308 03:41:17.338401 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-config-data\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.339300 master-0 kubenswrapper[13046]: I0308 03:41:17.339264 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52318bdd-862f-46ac-af96-9672cf810025-logs\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.339341 master-0 kubenswrapper[13046]: I0308 03:41:17.339327 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.339375 master-0 kubenswrapper[13046]: I0308 03:41:17.339354 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp79n\" (UniqueName: \"kubernetes.io/projected/52318bdd-862f-46ac-af96-9672cf810025-kube-api-access-rp79n\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.339879 master-0 kubenswrapper[13046]: I0308 03:41:17.339839 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52318bdd-862f-46ac-af96-9672cf810025-logs\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.344306 master-0 kubenswrapper[13046]: I0308 03:41:17.344262 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.345428 master-0 kubenswrapper[13046]: I0308 03:41:17.345391 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0"
Mar 08 03:41:17.349404 master-0 kubenswrapper[13046]: I0308 03:41:17.349356 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/52318bdd-862f-46ac-af96-9672cf810025-config-data\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0" Mar 08 03:41:17.358364 master-0 kubenswrapper[13046]: I0308 03:41:17.358288 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp79n\" (UniqueName: \"kubernetes.io/projected/52318bdd-862f-46ac-af96-9672cf810025-kube-api-access-rp79n\") pod \"nova-metadata-0\" (UID: \"52318bdd-862f-46ac-af96-9672cf810025\") " pod="openstack/nova-metadata-0" Mar 08 03:41:17.491064 master-0 kubenswrapper[13046]: I0308 03:41:17.490993 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 03:41:18.044505 master-0 kubenswrapper[13046]: I0308 03:41:18.044432 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 03:41:18.156771 master-0 kubenswrapper[13046]: I0308 03:41:18.156713 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761f3f7f-0243-43bb-a6c7-a702fc601758" path="/var/lib/kubelet/pods/761f3f7f-0243-43bb-a6c7-a702fc601758/volumes" Mar 08 03:41:18.432506 master-0 kubenswrapper[13046]: I0308 03:41:18.432447 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:41:18.502621 master-0 kubenswrapper[13046]: I0308 03:41:18.500032 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8zq5\" (UniqueName: \"kubernetes.io/projected/b624d075-24f1-4e0a-8638-d4a694ab697f-kube-api-access-x8zq5\") pod \"b624d075-24f1-4e0a-8638-d4a694ab697f\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " Mar 08 03:41:18.502621 master-0 kubenswrapper[13046]: I0308 03:41:18.500293 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-combined-ca-bundle\") pod \"b624d075-24f1-4e0a-8638-d4a694ab697f\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " Mar 08 03:41:18.502621 master-0 kubenswrapper[13046]: I0308 03:41:18.500373 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-config-data\") pod \"b624d075-24f1-4e0a-8638-d4a694ab697f\" (UID: \"b624d075-24f1-4e0a-8638-d4a694ab697f\") " Mar 08 03:41:18.507872 master-0 kubenswrapper[13046]: I0308 03:41:18.507830 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b624d075-24f1-4e0a-8638-d4a694ab697f-kube-api-access-x8zq5" (OuterVolumeSpecName: "kube-api-access-x8zq5") pod "b624d075-24f1-4e0a-8638-d4a694ab697f" (UID: "b624d075-24f1-4e0a-8638-d4a694ab697f"). InnerVolumeSpecName "kube-api-access-x8zq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 03:41:18.541279 master-0 kubenswrapper[13046]: I0308 03:41:18.539256 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b624d075-24f1-4e0a-8638-d4a694ab697f" (UID: "b624d075-24f1-4e0a-8638-d4a694ab697f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:18.569316 master-0 kubenswrapper[13046]: I0308 03:41:18.568898 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-config-data" (OuterVolumeSpecName: "config-data") pod "b624d075-24f1-4e0a-8638-d4a694ab697f" (UID: "b624d075-24f1-4e0a-8638-d4a694ab697f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:41:18.603620 master-0 kubenswrapper[13046]: I0308 03:41:18.603557 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:18.603620 master-0 kubenswrapper[13046]: I0308 03:41:18.603615 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b624d075-24f1-4e0a-8638-d4a694ab697f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:18.603849 master-0 kubenswrapper[13046]: I0308 03:41:18.603637 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8zq5\" (UniqueName: \"kubernetes.io/projected/b624d075-24f1-4e0a-8638-d4a694ab697f-kube-api-access-x8zq5\") on node \"master-0\" DevicePath \"\"" Mar 08 03:41:18.781592 master-0 kubenswrapper[13046]: I0308 03:41:18.781438 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"52318bdd-862f-46ac-af96-9672cf810025","Type":"ContainerStarted","Data":"06be5357da39b00c20ea4975ecd24fd0f275f08ba523850f1c166e7d5f3ab91a"} Mar 08 03:41:18.781592 master-0 kubenswrapper[13046]: I0308 03:41:18.781533 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52318bdd-862f-46ac-af96-9672cf810025","Type":"ContainerStarted","Data":"a814aa75dd7b21c982ced7f88ee50a10ea1ab629e0d07e412d1842c0a72cd481"} Mar 08 03:41:18.781592 master-0 kubenswrapper[13046]: I0308 03:41:18.781550 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52318bdd-862f-46ac-af96-9672cf810025","Type":"ContainerStarted","Data":"07ddf64707a4258e31041f666aaaf8fa7fac753c9be4fa2e64ba7a56117adf59"} Mar 08 03:41:18.784898 master-0 kubenswrapper[13046]: I0308 03:41:18.784842 13046 generic.go:334] "Generic (PLEG): container finished" podID="b624d075-24f1-4e0a-8638-d4a694ab697f" containerID="4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" exitCode=0 Mar 08 03:41:18.784973 master-0 kubenswrapper[13046]: I0308 03:41:18.784913 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b624d075-24f1-4e0a-8638-d4a694ab697f","Type":"ContainerDied","Data":"4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e"} Mar 08 03:41:18.784973 master-0 kubenswrapper[13046]: I0308 03:41:18.784955 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b624d075-24f1-4e0a-8638-d4a694ab697f","Type":"ContainerDied","Data":"5556917f01ce675bc3432988d09c68aff2243c0f1ea22d6bea40a4e8fa186f03"} Mar 08 03:41:18.785036 master-0 kubenswrapper[13046]: I0308 03:41:18.784985 13046 scope.go:117] "RemoveContainer" containerID="4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" Mar 08 03:41:18.785186 master-0 kubenswrapper[13046]: I0308 03:41:18.785151 13046 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:41:18.853838 master-0 kubenswrapper[13046]: I0308 03:41:18.853576 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.853544115 podStartE2EDuration="1.853544115s" podCreationTimestamp="2026-03-08 03:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:41:18.844766966 +0000 UTC m=+1680.923534183" watchObservedRunningTime="2026-03-08 03:41:18.853544115 +0000 UTC m=+1680.932311332" Mar 08 03:41:18.860683 master-0 kubenswrapper[13046]: I0308 03:41:18.860631 13046 scope.go:117] "RemoveContainer" containerID="4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" Mar 08 03:41:18.866949 master-0 kubenswrapper[13046]: E0308 03:41:18.866896 13046 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e\": container with ID starting with 4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e not found: ID does not exist" containerID="4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e" Mar 08 03:41:18.867099 master-0 kubenswrapper[13046]: I0308 03:41:18.866949 13046 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e"} err="failed to get container status \"4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e\": rpc error: code = NotFound desc = could not find container \"4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e\": container with ID starting with 4353ac5427da336ea869625946fb2b0a57c79a5a9eae5167a6327d7b5ad6ee1e not found: ID does not exist" Mar 08 03:41:18.889934 master-0 kubenswrapper[13046]: I0308 03:41:18.889858 13046 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:41:18.909931 master-0 kubenswrapper[13046]: I0308 03:41:18.909857 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:41:18.922993 master-0 kubenswrapper[13046]: I0308 03:41:18.922920 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:41:18.923504 master-0 kubenswrapper[13046]: E0308 03:41:18.923456 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b624d075-24f1-4e0a-8638-d4a694ab697f" containerName="nova-scheduler-scheduler" Mar 08 03:41:18.923504 master-0 kubenswrapper[13046]: I0308 03:41:18.923477 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="b624d075-24f1-4e0a-8638-d4a694ab697f" containerName="nova-scheduler-scheduler" Mar 08 03:41:18.923790 master-0 kubenswrapper[13046]: I0308 03:41:18.923765 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="b624d075-24f1-4e0a-8638-d4a694ab697f" containerName="nova-scheduler-scheduler" Mar 08 03:41:18.924631 master-0 kubenswrapper[13046]: I0308 03:41:18.924606 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:41:18.927869 master-0 kubenswrapper[13046]: I0308 03:41:18.927821 13046 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 03:41:18.959174 master-0 kubenswrapper[13046]: I0308 03:41:18.955270 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:41:19.033089 master-0 kubenswrapper[13046]: I0308 03:41:19.032214 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6334b14-f49e-4a43-867a-2456a72324ab-config-data\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.033089 master-0 kubenswrapper[13046]: I0308 03:41:19.032326 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6334b14-f49e-4a43-867a-2456a72324ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.033089 master-0 kubenswrapper[13046]: I0308 03:41:19.032833 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/c6334b14-f49e-4a43-867a-2456a72324ab-kube-api-access-dvrfr\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.134729 master-0 kubenswrapper[13046]: I0308 03:41:19.134674 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6334b14-f49e-4a43-867a-2456a72324ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" 
Mar 08 03:41:19.134847 master-0 kubenswrapper[13046]: I0308 03:41:19.134733 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6334b14-f49e-4a43-867a-2456a72324ab-config-data\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.134956 master-0 kubenswrapper[13046]: I0308 03:41:19.134930 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/c6334b14-f49e-4a43-867a-2456a72324ab-kube-api-access-dvrfr\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.138649 master-0 kubenswrapper[13046]: I0308 03:41:19.138610 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6334b14-f49e-4a43-867a-2456a72324ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.141662 master-0 kubenswrapper[13046]: I0308 03:41:19.141614 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6334b14-f49e-4a43-867a-2456a72324ab-config-data\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.157141 master-0 kubenswrapper[13046]: I0308 03:41:19.157063 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrfr\" (UniqueName: \"kubernetes.io/projected/c6334b14-f49e-4a43-867a-2456a72324ab-kube-api-access-dvrfr\") pod \"nova-scheduler-0\" (UID: \"c6334b14-f49e-4a43-867a-2456a72324ab\") " pod="openstack/nova-scheduler-0" Mar 08 03:41:19.257859 master-0 kubenswrapper[13046]: I0308 03:41:19.257786 13046 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 03:41:19.787869 master-0 kubenswrapper[13046]: I0308 03:41:19.787801 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 03:41:19.791861 master-0 kubenswrapper[13046]: W0308 03:41:19.791825 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6334b14_f49e_4a43_867a_2456a72324ab.slice/crio-ccbbc2dea177d18175205175fa6b077828cd470400319935bfd9d1c3289dae06 WatchSource:0}: Error finding container ccbbc2dea177d18175205175fa6b077828cd470400319935bfd9d1c3289dae06: Status 404 returned error can't find the container with id ccbbc2dea177d18175205175fa6b077828cd470400319935bfd9d1c3289dae06 Mar 08 03:41:20.140842 master-0 kubenswrapper[13046]: I0308 03:41:20.140774 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b624d075-24f1-4e0a-8638-d4a694ab697f" path="/var/lib/kubelet/pods/b624d075-24f1-4e0a-8638-d4a694ab697f/volumes" Mar 08 03:41:20.825279 master-0 kubenswrapper[13046]: I0308 03:41:20.825200 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6334b14-f49e-4a43-867a-2456a72324ab","Type":"ContainerStarted","Data":"17d4e5c6a68fa8111fffdfb55621bb24f42392ce23e79c3f259a5df95771c181"} Mar 08 03:41:20.825279 master-0 kubenswrapper[13046]: I0308 03:41:20.825266 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c6334b14-f49e-4a43-867a-2456a72324ab","Type":"ContainerStarted","Data":"ccbbc2dea177d18175205175fa6b077828cd470400319935bfd9d1c3289dae06"} Mar 08 03:41:20.852426 master-0 kubenswrapper[13046]: I0308 03:41:20.852348 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.852329357 podStartE2EDuration="2.852329357s" podCreationTimestamp="2026-03-08 03:41:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:41:20.849169997 +0000 UTC m=+1682.927937224" watchObservedRunningTime="2026-03-08 03:41:20.852329357 +0000 UTC m=+1682.931096574" Mar 08 03:41:22.491208 master-0 kubenswrapper[13046]: I0308 03:41:22.491127 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 03:41:22.491943 master-0 kubenswrapper[13046]: I0308 03:41:22.491435 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 03:41:24.259099 master-0 kubenswrapper[13046]: I0308 03:41:24.258975 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 03:41:24.444111 master-0 kubenswrapper[13046]: I0308 03:41:24.444029 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 03:41:24.444319 master-0 kubenswrapper[13046]: I0308 03:41:24.444127 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 03:41:25.491755 master-0 kubenswrapper[13046]: I0308 03:41:25.491660 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3d57373c-2f48-40da-baa7-611702e9ace5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.17:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:41:25.492505 master-0 kubenswrapper[13046]: I0308 03:41:25.491749 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3d57373c-2f48-40da-baa7-611702e9ace5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.17:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:41:27.491547 master-0 kubenswrapper[13046]: I0308 03:41:27.491464 13046 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 03:41:27.492243 master-0 kubenswrapper[13046]: I0308 03:41:27.491860 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 03:41:28.513760 master-0 kubenswrapper[13046]: I0308 03:41:28.513665 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="52318bdd-862f-46ac-af96-9672cf810025" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:41:28.514699 master-0 kubenswrapper[13046]: I0308 03:41:28.513674 13046 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="52318bdd-862f-46ac-af96-9672cf810025" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 03:41:29.258686 master-0 kubenswrapper[13046]: I0308 03:41:29.258612 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 03:41:29.311984 master-0 kubenswrapper[13046]: I0308 03:41:29.311926 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 03:41:29.975221 master-0 kubenswrapper[13046]: I0308 03:41:29.975161 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 03:41:34.456301 master-0 kubenswrapper[13046]: I0308 03:41:34.456260 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 03:41:34.456868 master-0 kubenswrapper[13046]: I0308 03:41:34.456325 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 03:41:34.456926 
master-0 kubenswrapper[13046]: I0308 03:41:34.456888 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 03:41:34.456974 master-0 kubenswrapper[13046]: I0308 03:41:34.456937 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 03:41:34.461239 master-0 kubenswrapper[13046]: I0308 03:41:34.461195 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 03:41:34.465330 master-0 kubenswrapper[13046]: I0308 03:41:34.465288 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 03:41:37.496630 master-0 kubenswrapper[13046]: I0308 03:41:37.496549 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 03:41:37.502647 master-0 kubenswrapper[13046]: I0308 03:41:37.502599 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 03:41:37.508987 master-0 kubenswrapper[13046]: I0308 03:41:37.508929 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 03:41:38.071931 master-0 kubenswrapper[13046]: I0308 03:41:38.071893 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 03:42:04.805112 master-0 kubenswrapper[13046]: I0308 03:42:04.798792 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-7ll2s"] Mar 08 03:42:04.805112 master-0 kubenswrapper[13046]: I0308 03:42:04.799074 13046 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" podUID="33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" containerName="sushy-emulator" containerID="cri-o://a8b4be0d1f53cf129f5d0218c0e9b67e67e55642130383a9a90503d5b655aa27" gracePeriod=30 Mar 08 
03:42:05.481563 master-0 kubenswrapper[13046]: I0308 03:42:05.481470 13046 generic.go:334] "Generic (PLEG): container finished" podID="33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" containerID="a8b4be0d1f53cf129f5d0218c0e9b67e67e55642130383a9a90503d5b655aa27" exitCode=0 Mar 08 03:42:05.481563 master-0 kubenswrapper[13046]: I0308 03:42:05.481514 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" event={"ID":"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8","Type":"ContainerDied","Data":"a8b4be0d1f53cf129f5d0218c0e9b67e67e55642130383a9a90503d5b655aa27"} Mar 08 03:42:05.481828 master-0 kubenswrapper[13046]: I0308 03:42:05.481606 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" event={"ID":"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8","Type":"ContainerDied","Data":"1ddca93e43838e863d038e1c117d92ed8d3dcd0f58883844e93fe1f1a1eba62a"} Mar 08 03:42:05.481828 master-0 kubenswrapper[13046]: I0308 03:42:05.481634 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ddca93e43838e863d038e1c117d92ed8d3dcd0f58883844e93fe1f1a1eba62a" Mar 08 03:42:05.572710 master-0 kubenswrapper[13046]: I0308 03:42:05.572655 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s" Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.718374 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-vqnr7"] Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: E0308 03:42:05.719017 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" containerName="sushy-emulator" Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.719036 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" containerName="sushy-emulator" Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.719132 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-sushy-emulator-config\") pod \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.719240 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x82ft\" (UniqueName: \"kubernetes.io/projected/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-kube-api-access-x82ft\") pod \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\" (UID: \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.719314 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" containerName="sushy-emulator" Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.719382 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-os-client-config\") pod \"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\" (UID: 
\"33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8\") " Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.720210 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7" Mar 08 03:42:05.726620 master-0 kubenswrapper[13046]: I0308 03:42:05.725585 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" (UID: "33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8"). InnerVolumeSpecName "sushy-emulator-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 03:42:05.734007 master-0 kubenswrapper[13046]: I0308 03:42:05.730716 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" (UID: "33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 03:42:05.734007 master-0 kubenswrapper[13046]: I0308 03:42:05.731789 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-kube-api-access-x82ft" (OuterVolumeSpecName: "kube-api-access-x82ft") pod "33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" (UID: "33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8"). InnerVolumeSpecName "kube-api-access-x82ft". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 03:42:05.756520 master-0 kubenswrapper[13046]: I0308 03:42:05.753415 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-vqnr7"]
Mar 08 03:42:05.824359 master-0 kubenswrapper[13046]: I0308 03:42:05.824304 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/ce341bb3-fbb0-4093-ae47-2f9c218d6250-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.824963 master-0 kubenswrapper[13046]: I0308 03:42:05.824460 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/ce341bb3-fbb0-4093-ae47-2f9c218d6250-os-client-config\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.824963 master-0 kubenswrapper[13046]: I0308 03:42:05.824544 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c5bl\" (UniqueName: \"kubernetes.io/projected/ce341bb3-fbb0-4093-ae47-2f9c218d6250-kube-api-access-2c5bl\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.824963 master-0 kubenswrapper[13046]: I0308 03:42:05.824723 13046 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-sushy-emulator-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:42:05.824963 master-0 kubenswrapper[13046]: I0308 03:42:05.824738 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x82ft\" (UniqueName: \"kubernetes.io/projected/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-kube-api-access-x82ft\") on node \"master-0\" DevicePath \"\""
Mar 08 03:42:05.824963 master-0 kubenswrapper[13046]: I0308 03:42:05.824748 13046 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8-os-client-config\") on node \"master-0\" DevicePath \"\""
Mar 08 03:42:05.926668 master-0 kubenswrapper[13046]: I0308 03:42:05.926602 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c5bl\" (UniqueName: \"kubernetes.io/projected/ce341bb3-fbb0-4093-ae47-2f9c218d6250-kube-api-access-2c5bl\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.926865 master-0 kubenswrapper[13046]: I0308 03:42:05.926758 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/ce341bb3-fbb0-4093-ae47-2f9c218d6250-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.926865 master-0 kubenswrapper[13046]: I0308 03:42:05.926853 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/ce341bb3-fbb0-4093-ae47-2f9c218d6250-os-client-config\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.927968 master-0 kubenswrapper[13046]: I0308 03:42:05.927928 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/ce341bb3-fbb0-4093-ae47-2f9c218d6250-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.931403 master-0 kubenswrapper[13046]: I0308 03:42:05.931351 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/ce341bb3-fbb0-4093-ae47-2f9c218d6250-os-client-config\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:05.942925 master-0 kubenswrapper[13046]: I0308 03:42:05.942879 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c5bl\" (UniqueName: \"kubernetes.io/projected/ce341bb3-fbb0-4093-ae47-2f9c218d6250-kube-api-access-2c5bl\") pod \"sushy-emulator-84965d5d88-vqnr7\" (UID: \"ce341bb3-fbb0-4093-ae47-2f9c218d6250\") " pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:06.104088 master-0 kubenswrapper[13046]: I0308 03:42:06.104033 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:06.499780 master-0 kubenswrapper[13046]: I0308 03:42:06.499659 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-7ll2s"
Mar 08 03:42:06.573787 master-0 kubenswrapper[13046]: I0308 03:42:06.573705 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-7ll2s"]
Mar 08 03:42:06.586138 master-0 kubenswrapper[13046]: I0308 03:42:06.584538 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-7ll2s"]
Mar 08 03:42:06.700924 master-0 kubenswrapper[13046]: I0308 03:42:06.700859 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-vqnr7"]
Mar 08 03:42:07.513372 master-0 kubenswrapper[13046]: I0308 03:42:07.513328 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7" event={"ID":"ce341bb3-fbb0-4093-ae47-2f9c218d6250","Type":"ContainerStarted","Data":"7956f0ea638dc50447eeb52b7ba959a0af6fcd9534aeb4fa6f62d5f327243af9"}
Mar 08 03:42:07.513998 master-0 kubenswrapper[13046]: I0308 03:42:07.513977 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7" event={"ID":"ce341bb3-fbb0-4093-ae47-2f9c218d6250","Type":"ContainerStarted","Data":"9c6175a72c2bfcb027d235d0fc8bc16bfbad095aef524fd2bdc0f861802f8490"}
Mar 08 03:42:07.562132 master-0 kubenswrapper[13046]: I0308 03:42:07.562008 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7" podStartSLOduration=2.561984034 podStartE2EDuration="2.561984034s" podCreationTimestamp="2026-03-08 03:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 03:42:07.537210582 +0000 UTC m=+1729.615977819" watchObservedRunningTime="2026-03-08 03:42:07.561984034 +0000 UTC m=+1729.640751271"
Mar 08 03:42:08.139078 master-0 kubenswrapper[13046]: I0308 03:42:08.138992 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8" path="/var/lib/kubelet/pods/33ecfd98-a9c0-459e-9ba9-b744f9a4b4f8/volumes"
Mar 08 03:42:16.105549 master-0 kubenswrapper[13046]: I0308 03:42:16.105332 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:16.105549 master-0 kubenswrapper[13046]: I0308 03:42:16.105466 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:16.144605 master-0 kubenswrapper[13046]: I0308 03:42:16.144550 13046 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:16.658298 master-0 kubenswrapper[13046]: I0308 03:42:16.658241 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-84965d5d88-vqnr7"
Mar 08 03:42:22.885850 master-0 kubenswrapper[13046]: I0308 03:42:22.885771 13046 scope.go:117] "RemoveContainer" containerID="a8b4be0d1f53cf129f5d0218c0e9b67e67e55642130383a9a90503d5b655aa27"
Mar 08 03:43:23.002204 master-0 kubenswrapper[13046]: I0308 03:43:23.002049 13046 scope.go:117] "RemoveContainer" containerID="48cd038ded94fcd5f505dbc354f100a71f9b275b1fe4afa1e68a9b25d76dfa36"
Mar 08 03:43:23.823270 master-0 kubenswrapper[13046]: E0308 03:43:23.823223 13046 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.32.10:50000->192.168.32.10:33351: read tcp 192.168.32.10:50000->192.168.32.10:33351: read: connection reset by peer
Mar 08 03:44:23.097442 master-0 kubenswrapper[13046]: I0308 03:44:23.097358 13046 scope.go:117] "RemoveContainer" containerID="b1d28aefb2d4d0f07983029a4b00c727be020bd6d0e72d488ab98cca65d32172"
Mar 08 03:44:23.149742 master-0 kubenswrapper[13046]: I0308 03:44:23.149572 13046 scope.go:117] "RemoveContainer" containerID="ddf754030224c81da1358e95a2d6672e0d5d7b9ca883aa6e3f0f767ef0c9aa66"
Mar 08 03:44:44.383242 master-0 kubenswrapper[13046]: I0308 03:44:44.382367 13046 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-e64dd-backup-0" podUID="100689ad-dc43-494b-a7a2-f0351b969ab7" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.128.0.238:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 03:46:18.714747 master-0 kubenswrapper[13046]: E0308 03:46:18.714614 13046 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff" to get inode usage: stat /var/lib/containers/storage/overlay/7e681ac9e51c47994a4cf9c89c2a77ceb722aa9fed729d0f980153d228a72736/diff: no such file or directory, extraDiskErr:
Mar 08 03:47:17.070029 master-0 kubenswrapper[13046]: I0308 03:47:17.069588 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jj9pp"]
Mar 08 03:47:17.090744 master-0 kubenswrapper[13046]: I0308 03:47:17.090658 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jj9pp"]
Mar 08 03:47:18.046295 master-0 kubenswrapper[13046]: I0308 03:47:18.046220 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kpzhp"]
Mar 08 03:47:18.063071 master-0 kubenswrapper[13046]: I0308 03:47:18.062995 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kpzhp"]
Mar 08 03:47:18.162758 master-0 kubenswrapper[13046]: I0308 03:47:18.162668 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd2c544-a565-4f70-978b-667ba7c35a57" path="/var/lib/kubelet/pods/2cd2c544-a565-4f70-978b-667ba7c35a57/volumes"
Mar 08 03:47:18.165198 master-0 kubenswrapper[13046]: I0308 03:47:18.165161 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0388d7-fdd0-4f0b-9614-8122eb3258ec" path="/var/lib/kubelet/pods/da0388d7-fdd0-4f0b-9614-8122eb3258ec/volumes"
Mar 08 03:47:19.089176 master-0 kubenswrapper[13046]: I0308 03:47:19.089105 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xbkt2"]
Mar 08 03:47:19.115528 master-0 kubenswrapper[13046]: I0308 03:47:19.114577 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-eb58-account-create-update-l7wpw"]
Mar 08 03:47:19.135153 master-0 kubenswrapper[13046]: I0308 03:47:19.135101 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-eb58-account-create-update-l7wpw"]
Mar 08 03:47:19.155514 master-0 kubenswrapper[13046]: I0308 03:47:19.153171 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c412-account-create-update-zvl4c"]
Mar 08 03:47:19.170555 master-0 kubenswrapper[13046]: I0308 03:47:19.169196 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xbkt2"]
Mar 08 03:47:19.190517 master-0 kubenswrapper[13046]: I0308 03:47:19.188131 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-95b6-account-create-update-jlsmg"]
Mar 08 03:47:19.205590 master-0 kubenswrapper[13046]: I0308 03:47:19.203714 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c412-account-create-update-zvl4c"]
Mar 08 03:47:19.220517 master-0 kubenswrapper[13046]: I0308 03:47:19.218719 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-95b6-account-create-update-jlsmg"]
Mar 08 03:47:20.131818 master-0 kubenswrapper[13046]: I0308 03:47:20.131768 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17dd1ec6-c2b0-46a9-b162-6efee4a883b9" path="/var/lib/kubelet/pods/17dd1ec6-c2b0-46a9-b162-6efee4a883b9/volumes"
Mar 08 03:47:20.132913 master-0 kubenswrapper[13046]: I0308 03:47:20.132877 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3025d7-d6e2-42c7-8352-ec8199a2e9ee" path="/var/lib/kubelet/pods/3d3025d7-d6e2-42c7-8352-ec8199a2e9ee/volumes"
Mar 08 03:47:20.135374 master-0 kubenswrapper[13046]: I0308 03:47:20.135244 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95995527-8076-43fb-8a0e-a9678030ad5e" path="/var/lib/kubelet/pods/95995527-8076-43fb-8a0e-a9678030ad5e/volumes"
Mar 08 03:47:20.137218 master-0 kubenswrapper[13046]: I0308 03:47:20.137197 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02bd09c-8e6b-40b9-967d-d93b3621ae5b" path="/var/lib/kubelet/pods/d02bd09c-8e6b-40b9-967d-d93b3621ae5b/volumes"
Mar 08 03:47:23.329538 master-0 kubenswrapper[13046]: I0308 03:47:23.329422 13046 scope.go:117] "RemoveContainer" containerID="5654f00d64188293757e1c9ef3cdccdf33780f01db9ecdd3d48ebddd0c502803"
Mar 08 03:47:23.360909 master-0 kubenswrapper[13046]: I0308 03:47:23.360855 13046 scope.go:117] "RemoveContainer" containerID="e9b00a230484d6acafff4a84213a2da26448c6cc415e76f80d1e3c74e1038f75"
Mar 08 03:47:23.390180 master-0 kubenswrapper[13046]: I0308 03:47:23.390101 13046 scope.go:117] "RemoveContainer" containerID="b3e824a99970d6cdb570b7aa177700f6929d08f94a4e2b673868af8bfb28b0fb"
Mar 08 03:47:23.425076 master-0 kubenswrapper[13046]: I0308 03:47:23.424550 13046 scope.go:117] "RemoveContainer" containerID="6f81c42c2ed92c30cdaf63e460e31b7e1bfa1a76b96f7e6f849b345fdb1766cc"
Mar 08 03:47:23.452917 master-0 kubenswrapper[13046]: I0308 03:47:23.452862 13046 scope.go:117] "RemoveContainer" containerID="461e31be4361e50fdbfc062802513f65ec890b82ef96ddb4f2e1d4b1a514eed0"
Mar 08 03:47:23.488980 master-0 kubenswrapper[13046]: I0308 03:47:23.488375 13046 scope.go:117] "RemoveContainer" containerID="824af6069fd162c64f76b33e7938589cd758c8885c13b07192e47f94f1b439d2"
Mar 08 03:47:32.062611 master-0 kubenswrapper[13046]: I0308 03:47:32.062470 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dc6lv"]
Mar 08 03:47:32.081280 master-0 kubenswrapper[13046]: I0308 03:47:32.081162 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dc6lv"]
Mar 08 03:47:32.139161 master-0 kubenswrapper[13046]: I0308 03:47:32.138960 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1689a9-e497-4ef2-9cf9-1622e16965d1" path="/var/lib/kubelet/pods/3e1689a9-e497-4ef2-9cf9-1622e16965d1/volumes"
Mar 08 03:47:47.098299 master-0 kubenswrapper[13046]: I0308 03:47:47.098213 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6blrg"]
Mar 08 03:47:47.113049 master-0 kubenswrapper[13046]: I0308 03:47:47.112974 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-cdm9x"]
Mar 08 03:47:47.128544 master-0 kubenswrapper[13046]: I0308 03:47:47.128439 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1cc2-account-create-update-7nfl4"]
Mar 08 03:47:47.138952 master-0 kubenswrapper[13046]: I0308 03:47:47.138885 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79d6-account-create-update-78z48"]
Mar 08 03:47:47.149090 master-0 kubenswrapper[13046]: I0308 03:47:47.149027 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-cdm9x"]
Mar 08 03:47:47.158519 master-0 kubenswrapper[13046]: I0308 03:47:47.158302 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6blrg"]
Mar 08 03:47:47.168507 master-0 kubenswrapper[13046]: I0308 03:47:47.168441 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79d6-account-create-update-78z48"]
Mar 08 03:47:47.178458 master-0 kubenswrapper[13046]: I0308 03:47:47.178408 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1cc2-account-create-update-7nfl4"]
Mar 08 03:47:48.133186 master-0 kubenswrapper[13046]: I0308 03:47:48.133118 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7f97b4-52cc-4108-95c4-dc762cd1398a" path="/var/lib/kubelet/pods/6d7f97b4-52cc-4108-95c4-dc762cd1398a/volumes"
Mar 08 03:47:48.134043 master-0 kubenswrapper[13046]: I0308 03:47:48.134009 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d02515-fb93-427a-9f9f-d97a1e68ec30" path="/var/lib/kubelet/pods/a8d02515-fb93-427a-9f9f-d97a1e68ec30/volumes"
Mar 08 03:47:48.134858 master-0 kubenswrapper[13046]: I0308 03:47:48.134827 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26dfae9-3c54-4102-8762-903c01f9eb23" path="/var/lib/kubelet/pods/d26dfae9-3c54-4102-8762-903c01f9eb23/volumes"
Mar 08 03:47:48.135985 master-0 kubenswrapper[13046]: I0308 03:47:48.135951 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5eac99-34cd-4cdf-af11-d7475573518d" path="/var/lib/kubelet/pods/fb5eac99-34cd-4cdf-af11-d7475573518d/volumes"
Mar 08 03:47:52.056980 master-0 kubenswrapper[13046]: I0308 03:47:52.056609 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vvfxv"]
Mar 08 03:47:52.076795 master-0 kubenswrapper[13046]: I0308 03:47:52.076727 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vvfxv"]
Mar 08 03:47:52.137096 master-0 kubenswrapper[13046]: I0308 03:47:52.136958 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ace6ef2-a01f-4585-8282-c24e3d7a8246" path="/var/lib/kubelet/pods/1ace6ef2-a01f-4585-8282-c24e3d7a8246/volumes"
Mar 08 03:47:53.108295 master-0 kubenswrapper[13046]: I0308 03:47:53.108253 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2bt4j"]
Mar 08 03:47:53.155490 master-0 kubenswrapper[13046]: I0308 03:47:53.155432 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2bt4j"]
Mar 08 03:47:54.141507 master-0 kubenswrapper[13046]: I0308 03:47:54.141408 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1f8575-4138-44d5-9be6-14a70bf8170c" path="/var/lib/kubelet/pods/7a1f8575-4138-44d5-9be6-14a70bf8170c/volumes"
Mar 08 03:47:59.051308 master-0 kubenswrapper[13046]: I0308 03:47:59.051245 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-b2m2r"]
Mar 08 03:47:59.071452 master-0 kubenswrapper[13046]: I0308 03:47:59.071385 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-b2m2r"]
Mar 08 03:48:00.038930 master-0 kubenswrapper[13046]: I0308 03:48:00.038837 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-7752-account-create-update-qwq4w"]
Mar 08 03:48:00.055308 master-0 kubenswrapper[13046]: I0308 03:48:00.048416 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-7752-account-create-update-qwq4w"]
Mar 08 03:48:00.132216 master-0 kubenswrapper[13046]: I0308 03:48:00.132129 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf185b5-9d0e-48e6-a961-fba08aa4688a" path="/var/lib/kubelet/pods/3cf185b5-9d0e-48e6-a961-fba08aa4688a/volumes"
Mar 08 03:48:00.133855 master-0 kubenswrapper[13046]: I0308 03:48:00.133812 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b0006e-a04e-4dcb-a516-b6d02385a494" path="/var/lib/kubelet/pods/f6b0006e-a04e-4dcb-a516-b6d02385a494/volumes"
Mar 08 03:48:20.056127 master-0 kubenswrapper[13046]: I0308 03:48:20.056038 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dzj84"]
Mar 08 03:48:20.073338 master-0 kubenswrapper[13046]: I0308 03:48:20.073279 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dzj84"]
Mar 08 03:48:20.135150 master-0 kubenswrapper[13046]: I0308 03:48:20.135082 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f71b2d7-115f-473a-9427-8af24a1a7467" path="/var/lib/kubelet/pods/2f71b2d7-115f-473a-9427-8af24a1a7467/volumes"
Mar 08 03:48:22.091086 master-0 kubenswrapper[13046]: I0308 03:48:22.090350 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7fl4v"]
Mar 08 03:48:22.104574 master-0 kubenswrapper[13046]: I0308 03:48:22.104520 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7fl4v"]
Mar 08 03:48:22.136136 master-0 kubenswrapper[13046]: I0308 03:48:22.136007 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cedf488b-a7e0-4a91-a6e8-d4cd25e33df6" path="/var/lib/kubelet/pods/cedf488b-a7e0-4a91-a6e8-d4cd25e33df6/volumes"
Mar 08 03:48:23.650981 master-0 kubenswrapper[13046]: I0308 03:48:23.650887 13046 scope.go:117] "RemoveContainer" containerID="c7bba9327c811230db2351bff4a5762e0a574248f1888fc51e9a20c50f9972fb"
Mar 08 03:48:23.683817 master-0 kubenswrapper[13046]: I0308 03:48:23.682224 13046 scope.go:117] "RemoveContainer" containerID="91498d4b392723401762fcc25df5e4008223bb59c3a5018aa4b5b71e88aba15a"
Mar 08 03:48:23.741142 master-0 kubenswrapper[13046]: I0308 03:48:23.741100 13046 scope.go:117] "RemoveContainer" containerID="a52466d9de2c81b5a9aa547f69f813cb0e0514604ff17f5e8119d310d0a49e89"
Mar 08 03:48:23.767937 master-0 kubenswrapper[13046]: I0308 03:48:23.767885 13046 scope.go:117] "RemoveContainer" containerID="0bc43cb2fb95a1524b067c67123e4f5b9af00b5c2abb9f0071b25bb073e375d3"
Mar 08 03:48:23.795506 master-0 kubenswrapper[13046]: I0308 03:48:23.795355 13046 scope.go:117] "RemoveContainer" containerID="37f135cf2b0475a62ec0c2666445995a3e6aec1d5a46e1675726a69ce8cdc369"
Mar 08 03:48:23.829354 master-0 kubenswrapper[13046]: I0308 03:48:23.829256 13046 scope.go:117] "RemoveContainer" containerID="a5b5a62d52c2d150f9d40c3c5dcbd8b35145551c7dc57bbf98937aa98ee8cd13"
Mar 08 03:48:23.862180 master-0 kubenswrapper[13046]: I0308 03:48:23.862120 13046 scope.go:117] "RemoveContainer" containerID="fb162992e6bc528bee5f9c0586b2d640cbe00d84c7c185f91dffcdf27cbe0181"
Mar 08 03:48:23.899799 master-0 kubenswrapper[13046]: I0308 03:48:23.899740 13046 scope.go:117] "RemoveContainer" containerID="ffcd9b52b57575139419d8472d69f62209c05803d8feecc51fdc332be824fbe6"
Mar 08 03:48:23.929673 master-0 kubenswrapper[13046]: I0308 03:48:23.929604 13046 scope.go:117] "RemoveContainer" containerID="5f2feb18084b6b8394640bdd39cec82ea0afd0d4e5afeea24398dbf8c76cceb1"
Mar 08 03:48:23.966323 master-0 kubenswrapper[13046]: I0308 03:48:23.966081 13046 scope.go:117] "RemoveContainer" containerID="94cdc5f29a6d860e7bc9f6764c6033f8792d94583a8fb39bfe4d486d089b390f"
Mar 08 03:48:23.997135 master-0 kubenswrapper[13046]: I0308 03:48:23.996972 13046 scope.go:117] "RemoveContainer" containerID="1bcbec65c1a02bf8dcc7269e365f626ba1fcfc51e07bbf539eddcee7a1c8e922"
Mar 08 03:48:27.062515 master-0 kubenswrapper[13046]: I0308 03:48:27.062224 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-k4mpt"]
Mar 08 03:48:27.080517 master-0 kubenswrapper[13046]: I0308 03:48:27.079018 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-k4mpt"]
Mar 08 03:48:28.147248 master-0 kubenswrapper[13046]: I0308 03:48:28.147152 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa4b3ae-ec1e-4819-a483-12a563171db2" path="/var/lib/kubelet/pods/9fa4b3ae-ec1e-4819-a483-12a563171db2/volumes"
Mar 08 03:48:33.062987 master-0 kubenswrapper[13046]: I0308 03:48:33.062773 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e64dd-db-sync-xz5js"]
Mar 08 03:48:33.076292 master-0 kubenswrapper[13046]: I0308 03:48:33.076220 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e64dd-db-sync-xz5js"]
Mar 08 03:48:34.170507 master-0 kubenswrapper[13046]: I0308 03:48:34.167162 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888c100e-3bc9-45fa-a5a2-fe687ee09c1c" path="/var/lib/kubelet/pods/888c100e-3bc9-45fa-a5a2-fe687ee09c1c/volumes"
Mar 08 03:48:44.073609 master-0 kubenswrapper[13046]: I0308 03:48:44.073524 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-j4bdp"]
Mar 08 03:48:44.096530 master-0 kubenswrapper[13046]: I0308 03:48:44.092367 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-j4bdp"]
Mar 08 03:48:44.142616 master-0 kubenswrapper[13046]: I0308 03:48:44.142526 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4248bb04-5f13-4afc-9263-49f3c929cd50" path="/var/lib/kubelet/pods/4248bb04-5f13-4afc-9263-49f3c929cd50/volumes"
Mar 08 03:48:51.057966 master-0 kubenswrapper[13046]: I0308 03:48:51.057827 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-npd9v"]
Mar 08 03:48:51.071249 master-0 kubenswrapper[13046]: I0308 03:48:51.071169 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-a807-account-create-update-j48bd"]
Mar 08 03:48:51.082109 master-0 kubenswrapper[13046]: I0308 03:48:51.082036 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-npd9v"]
Mar 08 03:48:51.092937 master-0 kubenswrapper[13046]: I0308 03:48:51.092868 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-a807-account-create-update-j48bd"]
Mar 08 03:48:52.141614 master-0 kubenswrapper[13046]: I0308 03:48:52.141532 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a2178d-e786-49e5-995e-ac4b269e0089" path="/var/lib/kubelet/pods/12a2178d-e786-49e5-995e-ac4b269e0089/volumes"
Mar 08 03:48:52.142662 master-0 kubenswrapper[13046]: I0308 03:48:52.142611 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3" path="/var/lib/kubelet/pods/ca72dcd1-1efe-4383-8f9e-3959c2f0a3a3/volumes"
Mar 08 03:49:22.076842 master-0 kubenswrapper[13046]: I0308 03:49:22.076653 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-rcvjv"]
Mar 08 03:49:22.095202 master-0 kubenswrapper[13046]: I0308 03:49:22.095108 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-rcvjv"]
Mar 08 03:49:22.141505 master-0 kubenswrapper[13046]: I0308 03:49:22.141408 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327283ac-a839-484a-a4aa-daf30c72b9f4" path="/var/lib/kubelet/pods/327283ac-a839-484a-a4aa-daf30c72b9f4/volumes"
Mar 08 03:49:24.291270 master-0 kubenswrapper[13046]: I0308 03:49:24.291146 13046 scope.go:117] "RemoveContainer" containerID="21c186c0e916378781593218621e52a7e8505b7e0fa558abdd4fb8dc561a04bd"
Mar 08 03:49:24.332831 master-0 kubenswrapper[13046]: I0308 03:49:24.332781 13046 scope.go:117] "RemoveContainer" containerID="c3fb1d9c0b95ed93f930b2974708ee53f1d91cee4541038371bf84c41a407db5"
Mar 08 03:49:24.379623 master-0 kubenswrapper[13046]: I0308 03:49:24.379574 13046 scope.go:117] "RemoveContainer" containerID="4d71bda5b031b2d9a24e5b6a59a0fc20622fc76fecb76d2cacb2685b9dfd5b5b"
Mar 08 03:49:24.417406 master-0 kubenswrapper[13046]: I0308 03:49:24.417336 13046 scope.go:117] "RemoveContainer" containerID="7bb131816473e380629730641e3d66107e61fb8bfdc5c308aa971996501c926a"
Mar 08 03:49:24.440774 master-0 kubenswrapper[13046]: I0308 03:49:24.440721 13046 scope.go:117] "RemoveContainer" containerID="a300fa2a572f7a256930ec382dbb5fb8085e21791741be6fe4b0f8b9cabbf9a4"
Mar 08 03:49:24.477766 master-0 kubenswrapper[13046]: I0308 03:49:24.477724 13046 scope.go:117] "RemoveContainer" containerID="0cc79d5e6a4a278bdfa5a571e1ccea4389592109042ae880cffd0667a3d11640"
Mar 08 03:49:24.502062 master-0 kubenswrapper[13046]: I0308 03:49:24.501905 13046 scope.go:117] "RemoveContainer" containerID="a07e2e878060a43c1b69f035fd7659c0fcbd40f297fb45c9fe13e1436bf2d989"
Mar 08 03:49:25.053544 master-0 kubenswrapper[13046]: I0308 03:49:25.050899 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-xds8t"]
Mar 08 03:49:25.064339 master-0 kubenswrapper[13046]: I0308 03:49:25.064278 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-xds8t"]
Mar 08 03:49:26.053025 master-0 kubenswrapper[13046]: I0308 03:49:26.052966 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pdtcb"]
Mar 08 03:49:26.071354 master-0 kubenswrapper[13046]: I0308 03:49:26.071197 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6x7dc"]
Mar 08 03:49:26.081795 master-0 kubenswrapper[13046]: I0308 03:49:26.081728 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pdtcb"]
Mar 08 03:49:26.094793 master-0 kubenswrapper[13046]: I0308 03:49:26.094682 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6x7dc"]
Mar 08 03:49:26.140866 master-0 kubenswrapper[13046]: I0308 03:49:26.140803 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8ddcef-7329-4111-b9bd-c4ce0d6bf682" path="/var/lib/kubelet/pods/7b8ddcef-7329-4111-b9bd-c4ce0d6bf682/volumes"
Mar 08 03:49:26.142065 master-0 kubenswrapper[13046]: I0308 03:49:26.142036 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11" path="/var/lib/kubelet/pods/a9b8c1f0-614d-4fc5-a3c4-a9c1e0893d11/volumes"
Mar 08 03:49:26.144191 master-0 kubenswrapper[13046]: I0308 03:49:26.144150 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db81925e-2987-49d2-a511-33a9f40ddd8c" path="/var/lib/kubelet/pods/db81925e-2987-49d2-a511-33a9f40ddd8c/volumes"
Mar 08 03:49:27.059400 master-0 kubenswrapper[13046]: I0308 03:49:27.059289 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4cfc-account-create-update-sf2rb"]
Mar 08 03:49:27.082849 master-0 kubenswrapper[13046]: I0308 03:49:27.082466 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4cfc-account-create-update-sf2rb"]
Mar 08 03:49:28.059652 master-0 kubenswrapper[13046]: I0308 03:49:28.059591 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2420-account-create-update-mb88p"]
Mar 08 03:49:28.076830 master-0 kubenswrapper[13046]: I0308 03:49:28.076767 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-65fa-account-create-update-rktss"]
Mar 08 03:49:28.089462 master-0 kubenswrapper[13046]: I0308 03:49:28.089407 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2420-account-create-update-mb88p"]
Mar 08 03:49:28.100267 master-0 kubenswrapper[13046]: I0308 03:49:28.100211 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-65fa-account-create-update-rktss"]
Mar 08 03:49:28.135438 master-0 kubenswrapper[13046]: I0308 03:49:28.135362 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f3459cf-9e53-466c-906d-6a9033b782f1" path="/var/lib/kubelet/pods/2f3459cf-9e53-466c-906d-6a9033b782f1/volumes"
Mar 08 03:49:28.136232 master-0 kubenswrapper[13046]: I0308 03:49:28.136199 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bc77fe-5af2-4fe1-b7d5-321250be828e" path="/var/lib/kubelet/pods/38bc77fe-5af2-4fe1-b7d5-321250be828e/volumes"
Mar 08 03:49:28.137845 master-0 kubenswrapper[13046]: I0308 03:49:28.137787 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9dfc00-d4d2-454d-b76f-076660d4f9e2" path="/var/lib/kubelet/pods/5f9dfc00-d4d2-454d-b76f-076660d4f9e2/volumes"
Mar 08 03:50:03.086412 master-0 kubenswrapper[13046]: I0308 03:50:03.084521 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ckdg7"]
Mar 08 03:50:03.109135 master-0 kubenswrapper[13046]: I0308 03:50:03.109049 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ckdg7"]
Mar 08 03:50:04.134638 master-0 kubenswrapper[13046]: I0308 03:50:04.134579 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="510d6b1b-5eec-47ab-ba92-23ef35ec5f83" path="/var/lib/kubelet/pods/510d6b1b-5eec-47ab-ba92-23ef35ec5f83/volumes"
Mar 08 03:50:24.684479 master-0 kubenswrapper[13046]: I0308 03:50:24.684389 13046 scope.go:117] "RemoveContainer" containerID="011294082c37a5329c4e22fb7c0aef8f312655fabcd99e66c51dbe3fd5a0d2e3"
Mar 08 03:50:24.721361 master-0 kubenswrapper[13046]: I0308 03:50:24.721201 13046 scope.go:117] "RemoveContainer" containerID="4279058fa337d712d6af823b86ef9aaef600530db00fb1400bb2661a453c5f26"
Mar 08 03:50:24.756235 master-0 kubenswrapper[13046]: I0308 03:50:24.756141 13046 scope.go:117] "RemoveContainer" containerID="2c55c4fb9c09ccade0b205372432f0fe3032f53ceb514771f77876852142d13c"
Mar 08 03:50:24.795910 master-0 kubenswrapper[13046]: I0308 03:50:24.795847 13046 scope.go:117] "RemoveContainer" containerID="364f37c6908869497f0c2bd6e4b77df4d1c50c84eb6f4b98ccf89b3ea1b682f3"
Mar 08 03:50:24.821767 master-0 kubenswrapper[13046]: I0308 03:50:24.821706 13046 scope.go:117] "RemoveContainer" containerID="de5ab68dcd6866ba06ec2863e3e418b43b43b2b583d1232424b8d5115760880a"
Mar 08 03:50:24.851440 master-0 kubenswrapper[13046]: I0308 03:50:24.851372 13046 scope.go:117] "RemoveContainer" containerID="c881448e5ca03aa12111fdc11156330c6818f43b383ccc8c8aaed17fca718b22"
Mar 08 03:50:24.902120 master-0 kubenswrapper[13046]: I0308 03:50:24.902014 13046 scope.go:117] "RemoveContainer" containerID="2f3414ba3620c1892aedff07a5a4ed1960c34f3a19ab0fa0b736b988f75cf166"
Mar 08 03:50:31.090900 master-0 kubenswrapper[13046]: I0308 03:50:31.090812 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hrdjm"]
Mar 08 03:50:31.104301 master-0 kubenswrapper[13046]: I0308 03:50:31.104190 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hrdjm"]
Mar 08 03:50:32.054602 master-0 kubenswrapper[13046]: I0308 03:50:32.054361 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmrk"]
Mar 08 03:50:32.069567 master-0 kubenswrapper[13046]: I0308 03:50:32.069416 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lnmrk"]
Mar 08 03:50:32.141706 master-0 kubenswrapper[13046]: I0308 03:50:32.141420 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854672de-d415-46d9-810b-5f7d085f1969" path="/var/lib/kubelet/pods/854672de-d415-46d9-810b-5f7d085f1969/volumes"
Mar 08 03:50:32.142609 master-0 kubenswrapper[13046]: I0308 03:50:32.142253 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3bd0477-6b38-4d58-b53b-34c4c323496b" path="/var/lib/kubelet/pods/b3bd0477-6b38-4d58-b53b-34c4c323496b/volumes"
Mar 08 03:51:10.090416 master-0 kubenswrapper[13046]: I0308 03:51:10.090361 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-jgj9r"]
Mar 08 03:51:10.107452 master-0 kubenswrapper[13046]: I0308 03:51:10.107386 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-jgj9r"]
Mar 08 03:51:10.147459 master-0 kubenswrapper[13046]: I0308 03:51:10.147385 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bdf11b-0b56-43f0-8eb7-e15efda5ae67" path="/var/lib/kubelet/pods/65bdf11b-0b56-43f0-8eb7-e15efda5ae67/volumes"
Mar 08 03:51:12.053942 master-0 kubenswrapper[13046]: I0308 03:51:12.053797 13046 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8n5sd"]
Mar 08 03:51:12.072393 master-0 kubenswrapper[13046]: I0308 03:51:12.072301 13046 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8n5sd"]
Mar 08 03:51:12.141563 master-0 kubenswrapper[13046]: I0308 03:51:12.141459 13046 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73db713-dc45-42da-a26c-a6cf8d83821a" path="/var/lib/kubelet/pods/c73db713-dc45-42da-a26c-a6cf8d83821a/volumes"
Mar 08 03:51:25.109510 master-0 kubenswrapper[13046]: I0308 03:51:25.109414 13046 scope.go:117] "RemoveContainer" containerID="77a9e41b555abba36489ce3052653b67e27e7fb12a9ba0583a2d3384a02f51f1"
Mar 08 03:51:25.153852 master-0 kubenswrapper[13046]: I0308 03:51:25.153780 13046 scope.go:117] "RemoveContainer" containerID="297f43b318034f13eb7bc5d70e2e65b057d40949d2c0f3a73b72abdc005a922b"
Mar 08 03:51:25.198733 master-0 kubenswrapper[13046]: I0308 03:51:25.198660 13046 scope.go:117] "RemoveContainer" containerID="aec03aeaa079dff2f7d1817269e1f7bd4840803da8e785309968031d87af9708"
Mar 08 03:51:25.233675 master-0 kubenswrapper[13046]: I0308 03:51:25.233631 13046 scope.go:117] "RemoveContainer" containerID="8bedf1e50c5add1110f8c79fef570aa9c2d855d1823c1c3b5a7186f415883c04"
Mar 08 03:55:32.336732 master-0 kubenswrapper[13046]: E0308 03:55:32.336558 13046 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:36910->192.168.32.10:33351: write tcp 192.168.32.10:36910->192.168.32.10:33351: write: connection reset by peer
Mar 08 04:01:00.233687 master-0 kubenswrapper[13046]: I0308 04:01:00.232890 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29549041-ckb6v"]
Mar 08 04:01:00.235929 master-0 kubenswrapper[13046]: I0308 04:01:00.235872 13046 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.281449 master-0 kubenswrapper[13046]: I0308 04:01:00.280555 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29549041-ckb6v"] Mar 08 04:01:00.377058 master-0 kubenswrapper[13046]: I0308 04:01:00.376979 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-fernet-keys\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.377254 master-0 kubenswrapper[13046]: I0308 04:01:00.377150 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d287h\" (UniqueName: \"kubernetes.io/projected/4fc77ddd-0e59-4f30-96d2-380b1543777a-kube-api-access-d287h\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.377254 master-0 kubenswrapper[13046]: I0308 04:01:00.377233 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-combined-ca-bundle\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.377404 master-0 kubenswrapper[13046]: I0308 04:01:00.377372 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.479608 master-0 kubenswrapper[13046]: I0308 
04:01:00.479504 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.479842 master-0 kubenswrapper[13046]: I0308 04:01:00.479623 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-fernet-keys\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.479842 master-0 kubenswrapper[13046]: I0308 04:01:00.479773 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d287h\" (UniqueName: \"kubernetes.io/projected/4fc77ddd-0e59-4f30-96d2-380b1543777a-kube-api-access-d287h\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.480040 master-0 kubenswrapper[13046]: I0308 04:01:00.480003 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-combined-ca-bundle\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.483807 master-0 kubenswrapper[13046]: I0308 04:01:00.483724 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.487238 master-0 kubenswrapper[13046]: I0308 
04:01:00.487080 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-combined-ca-bundle\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.488215 master-0 kubenswrapper[13046]: I0308 04:01:00.488175 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-fernet-keys\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.505681 master-0 kubenswrapper[13046]: I0308 04:01:00.505625 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d287h\" (UniqueName: \"kubernetes.io/projected/4fc77ddd-0e59-4f30-96d2-380b1543777a-kube-api-access-d287h\") pod \"keystone-cron-29549041-ckb6v\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:00.578884 master-0 kubenswrapper[13046]: I0308 04:01:00.578837 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:01.179438 master-0 kubenswrapper[13046]: I0308 04:01:01.179369 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29549041-ckb6v"] Mar 08 04:01:01.421321 master-0 kubenswrapper[13046]: I0308 04:01:01.421192 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549041-ckb6v" event={"ID":"4fc77ddd-0e59-4f30-96d2-380b1543777a","Type":"ContainerStarted","Data":"da871582e9e292f2b4a11925e20dcd6a8a876197f72e18b5ae8d86a9e261cd90"} Mar 08 04:01:01.421321 master-0 kubenswrapper[13046]: I0308 04:01:01.421265 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549041-ckb6v" event={"ID":"4fc77ddd-0e59-4f30-96d2-380b1543777a","Type":"ContainerStarted","Data":"a4b04de5c2de766dd45d3973eb128e66c4e67a10c18e5e5fe297796386cba2b4"} Mar 08 04:01:01.457925 master-0 kubenswrapper[13046]: I0308 04:01:01.457836 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29549041-ckb6v" podStartSLOduration=1.45780255 podStartE2EDuration="1.45780255s" podCreationTimestamp="2026-03-08 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:01:01.447147017 +0000 UTC m=+2863.525914264" watchObservedRunningTime="2026-03-08 04:01:01.45780255 +0000 UTC m=+2863.536569767" Mar 08 04:01:04.468199 master-0 kubenswrapper[13046]: I0308 04:01:04.468031 13046 generic.go:334] "Generic (PLEG): container finished" podID="4fc77ddd-0e59-4f30-96d2-380b1543777a" containerID="da871582e9e292f2b4a11925e20dcd6a8a876197f72e18b5ae8d86a9e261cd90" exitCode=0 Mar 08 04:01:04.468199 master-0 kubenswrapper[13046]: I0308 04:01:04.468096 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549041-ckb6v" 
event={"ID":"4fc77ddd-0e59-4f30-96d2-380b1543777a","Type":"ContainerDied","Data":"da871582e9e292f2b4a11925e20dcd6a8a876197f72e18b5ae8d86a9e261cd90"} Mar 08 04:01:06.034958 master-0 kubenswrapper[13046]: I0308 04:01:06.034878 13046 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:01:06.141206 master-0 kubenswrapper[13046]: I0308 04:01:06.141125 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data\") pod \"4fc77ddd-0e59-4f30-96d2-380b1543777a\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " Mar 08 04:01:06.141686 master-0 kubenswrapper[13046]: I0308 04:01:06.141634 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-combined-ca-bundle\") pod \"4fc77ddd-0e59-4f30-96d2-380b1543777a\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " Mar 08 04:01:06.141810 master-0 kubenswrapper[13046]: I0308 04:01:06.141717 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-fernet-keys\") pod \"4fc77ddd-0e59-4f30-96d2-380b1543777a\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " Mar 08 04:01:06.142020 master-0 kubenswrapper[13046]: I0308 04:01:06.141942 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d287h\" (UniqueName: \"kubernetes.io/projected/4fc77ddd-0e59-4f30-96d2-380b1543777a-kube-api-access-d287h\") pod \"4fc77ddd-0e59-4f30-96d2-380b1543777a\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " Mar 08 04:01:06.145137 master-0 kubenswrapper[13046]: I0308 04:01:06.145085 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4fc77ddd-0e59-4f30-96d2-380b1543777a-kube-api-access-d287h" (OuterVolumeSpecName: "kube-api-access-d287h") pod "4fc77ddd-0e59-4f30-96d2-380b1543777a" (UID: "4fc77ddd-0e59-4f30-96d2-380b1543777a"). InnerVolumeSpecName "kube-api-access-d287h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 04:01:06.148862 master-0 kubenswrapper[13046]: I0308 04:01:06.148787 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4fc77ddd-0e59-4f30-96d2-380b1543777a" (UID: "4fc77ddd-0e59-4f30-96d2-380b1543777a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:01:06.172932 master-0 kubenswrapper[13046]: I0308 04:01:06.172836 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fc77ddd-0e59-4f30-96d2-380b1543777a" (UID: "4fc77ddd-0e59-4f30-96d2-380b1543777a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:01:06.245687 master-0 kubenswrapper[13046]: I0308 04:01:06.244396 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data" (OuterVolumeSpecName: "config-data") pod "4fc77ddd-0e59-4f30-96d2-380b1543777a" (UID: "4fc77ddd-0e59-4f30-96d2-380b1543777a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:01:06.245687 master-0 kubenswrapper[13046]: I0308 04:01:06.244921 13046 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data\") pod \"4fc77ddd-0e59-4f30-96d2-380b1543777a\" (UID: \"4fc77ddd-0e59-4f30-96d2-380b1543777a\") " Mar 08 04:01:06.245687 master-0 kubenswrapper[13046]: W0308 04:01:06.245114 13046 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4fc77ddd-0e59-4f30-96d2-380b1543777a/volumes/kubernetes.io~secret/config-data Mar 08 04:01:06.245687 master-0 kubenswrapper[13046]: I0308 04:01:06.245129 13046 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data" (OuterVolumeSpecName: "config-data") pod "4fc77ddd-0e59-4f30-96d2-380b1543777a" (UID: "4fc77ddd-0e59-4f30-96d2-380b1543777a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 04:01:06.246752 master-0 kubenswrapper[13046]: I0308 04:01:06.246457 13046 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 08 04:01:06.246752 master-0 kubenswrapper[13046]: I0308 04:01:06.246508 13046 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 08 04:01:06.246752 master-0 kubenswrapper[13046]: I0308 04:01:06.246518 13046 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d287h\" (UniqueName: \"kubernetes.io/projected/4fc77ddd-0e59-4f30-96d2-380b1543777a-kube-api-access-d287h\") on node \"master-0\" DevicePath \"\"" Mar 08 04:01:06.246752 master-0 kubenswrapper[13046]: I0308 04:01:06.246531 13046 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc77ddd-0e59-4f30-96d2-380b1543777a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 08 04:01:06.508862 master-0 kubenswrapper[13046]: I0308 04:01:06.508724 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29549041-ckb6v" event={"ID":"4fc77ddd-0e59-4f30-96d2-380b1543777a","Type":"ContainerDied","Data":"a4b04de5c2de766dd45d3973eb128e66c4e67a10c18e5e5fe297796386cba2b4"} Mar 08 04:01:06.508862 master-0 kubenswrapper[13046]: I0308 04:01:06.508791 13046 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b04de5c2de766dd45d3973eb128e66c4e67a10c18e5e5fe297796386cba2b4" Mar 08 04:01:06.508862 master-0 kubenswrapper[13046]: I0308 04:01:06.508810 13046 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29549041-ckb6v" Mar 08 04:04:49.708519 master-0 kubenswrapper[13046]: I0308 04:04:49.707591 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h2xfr/must-gather-k6nrm"] Mar 08 04:04:49.708519 master-0 kubenswrapper[13046]: E0308 04:04:49.708250 13046 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc77ddd-0e59-4f30-96d2-380b1543777a" containerName="keystone-cron" Mar 08 04:04:49.708519 master-0 kubenswrapper[13046]: I0308 04:04:49.708270 13046 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc77ddd-0e59-4f30-96d2-380b1543777a" containerName="keystone-cron" Mar 08 04:04:49.709298 master-0 kubenswrapper[13046]: I0308 04:04:49.708637 13046 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc77ddd-0e59-4f30-96d2-380b1543777a" containerName="keystone-cron" Mar 08 04:04:49.710932 master-0 kubenswrapper[13046]: I0308 04:04:49.710171 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:49.714510 master-0 kubenswrapper[13046]: I0308 04:04:49.712728 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h2xfr"/"openshift-service-ca.crt" Mar 08 04:04:49.714510 master-0 kubenswrapper[13046]: I0308 04:04:49.713060 13046 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h2xfr"/"kube-root-ca.crt" Mar 08 04:04:49.720638 master-0 kubenswrapper[13046]: I0308 04:04:49.719603 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h2xfr/must-gather-5gpqv"] Mar 08 04:04:49.723329 master-0 kubenswrapper[13046]: I0308 04:04:49.721518 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 04:04:49.753518 master-0 kubenswrapper[13046]: I0308 04:04:49.751643 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2xfr/must-gather-k6nrm"] Mar 08 04:04:49.764507 master-0 kubenswrapper[13046]: I0308 04:04:49.763403 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebb1cb39-3226-4730-9562-19f0d0018eca-must-gather-output\") pod \"must-gather-k6nrm\" (UID: \"ebb1cb39-3226-4730-9562-19f0d0018eca\") " pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:49.764507 master-0 kubenswrapper[13046]: I0308 04:04:49.763459 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2lv\" (UniqueName: \"kubernetes.io/projected/ebb1cb39-3226-4730-9562-19f0d0018eca-kube-api-access-sh2lv\") pod \"must-gather-k6nrm\" (UID: \"ebb1cb39-3226-4730-9562-19f0d0018eca\") " pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:49.764507 master-0 kubenswrapper[13046]: I0308 04:04:49.763513 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ebb0e2d-c17b-434f-8dae-2303884e02f6-must-gather-output\") pod \"must-gather-5gpqv\" (UID: \"5ebb0e2d-c17b-434f-8dae-2303884e02f6\") " pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 04:04:49.764507 master-0 kubenswrapper[13046]: I0308 04:04:49.764085 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxcrs\" (UniqueName: \"kubernetes.io/projected/5ebb0e2d-c17b-434f-8dae-2303884e02f6-kube-api-access-wxcrs\") pod \"must-gather-5gpqv\" (UID: \"5ebb0e2d-c17b-434f-8dae-2303884e02f6\") " pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 
04:04:49.772513 master-0 kubenswrapper[13046]: I0308 04:04:49.767944 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2xfr/must-gather-5gpqv"] Mar 08 04:04:49.867150 master-0 kubenswrapper[13046]: I0308 04:04:49.866472 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxcrs\" (UniqueName: \"kubernetes.io/projected/5ebb0e2d-c17b-434f-8dae-2303884e02f6-kube-api-access-wxcrs\") pod \"must-gather-5gpqv\" (UID: \"5ebb0e2d-c17b-434f-8dae-2303884e02f6\") " pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 04:04:49.867150 master-0 kubenswrapper[13046]: I0308 04:04:49.866624 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ebb1cb39-3226-4730-9562-19f0d0018eca-must-gather-output\") pod \"must-gather-k6nrm\" (UID: \"ebb1cb39-3226-4730-9562-19f0d0018eca\") " pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:49.867150 master-0 kubenswrapper[13046]: I0308 04:04:49.866655 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2lv\" (UniqueName: \"kubernetes.io/projected/ebb1cb39-3226-4730-9562-19f0d0018eca-kube-api-access-sh2lv\") pod \"must-gather-k6nrm\" (UID: \"ebb1cb39-3226-4730-9562-19f0d0018eca\") " pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:49.867150 master-0 kubenswrapper[13046]: I0308 04:04:49.866712 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ebb0e2d-c17b-434f-8dae-2303884e02f6-must-gather-output\") pod \"must-gather-5gpqv\" (UID: \"5ebb0e2d-c17b-434f-8dae-2303884e02f6\") " pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 04:04:49.873380 master-0 kubenswrapper[13046]: I0308 04:04:49.873322 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/ebb1cb39-3226-4730-9562-19f0d0018eca-must-gather-output\") pod \"must-gather-k6nrm\" (UID: \"ebb1cb39-3226-4730-9562-19f0d0018eca\") " pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:49.874544 master-0 kubenswrapper[13046]: I0308 04:04:49.874501 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ebb0e2d-c17b-434f-8dae-2303884e02f6-must-gather-output\") pod \"must-gather-5gpqv\" (UID: \"5ebb0e2d-c17b-434f-8dae-2303884e02f6\") " pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 04:04:49.904533 master-0 kubenswrapper[13046]: I0308 04:04:49.897784 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2lv\" (UniqueName: \"kubernetes.io/projected/ebb1cb39-3226-4730-9562-19f0d0018eca-kube-api-access-sh2lv\") pod \"must-gather-k6nrm\" (UID: \"ebb1cb39-3226-4730-9562-19f0d0018eca\") " pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:49.908583 master-0 kubenswrapper[13046]: I0308 04:04:49.907294 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxcrs\" (UniqueName: \"kubernetes.io/projected/5ebb0e2d-c17b-434f-8dae-2303884e02f6-kube-api-access-wxcrs\") pod \"must-gather-5gpqv\" (UID: \"5ebb0e2d-c17b-434f-8dae-2303884e02f6\") " pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 04:04:50.062740 master-0 kubenswrapper[13046]: I0308 04:04:50.062600 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h2xfr/must-gather-k6nrm" Mar 08 04:04:50.089517 master-0 kubenswrapper[13046]: I0308 04:04:50.089438 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h2xfr/must-gather-5gpqv" Mar 08 04:04:50.640698 master-0 kubenswrapper[13046]: I0308 04:04:50.640616 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2xfr/must-gather-k6nrm"] Mar 08 04:04:50.656883 master-0 kubenswrapper[13046]: W0308 04:04:50.656013 13046 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebb1cb39_3226_4730_9562_19f0d0018eca.slice/crio-c92d7b9c7493295203eae607d28c0bf6572230a6dd91eed2ca28ee83f2f4b1cf WatchSource:0}: Error finding container c92d7b9c7493295203eae607d28c0bf6572230a6dd91eed2ca28ee83f2f4b1cf: Status 404 returned error can't find the container with id c92d7b9c7493295203eae607d28c0bf6572230a6dd91eed2ca28ee83f2f4b1cf Mar 08 04:04:50.665432 master-0 kubenswrapper[13046]: I0308 04:04:50.665393 13046 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 04:04:50.781918 master-0 kubenswrapper[13046]: I0308 04:04:50.777242 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2xfr/must-gather-5gpqv"] Mar 08 04:04:50.809007 master-0 kubenswrapper[13046]: I0308 04:04:50.799607 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/must-gather-k6nrm" event={"ID":"ebb1cb39-3226-4730-9562-19f0d0018eca","Type":"ContainerStarted","Data":"c92d7b9c7493295203eae607d28c0bf6572230a6dd91eed2ca28ee83f2f4b1cf"} Mar 08 04:04:51.817570 master-0 kubenswrapper[13046]: I0308 04:04:51.817472 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/must-gather-5gpqv" event={"ID":"5ebb0e2d-c17b-434f-8dae-2303884e02f6","Type":"ContainerStarted","Data":"b33b94f5eda83d81cac36ab1ee1b5917095f163eba0b1124423c4d22086de455"} Mar 08 04:04:52.840277 master-0 kubenswrapper[13046]: I0308 04:04:52.840226 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-h2xfr/must-gather-5gpqv" event={"ID":"5ebb0e2d-c17b-434f-8dae-2303884e02f6","Type":"ContainerStarted","Data":"327142fff3425677a35fb227f42ec1dce1c94b645108c3aa599573150fe61070"} Mar 08 04:04:53.672321 master-0 kubenswrapper[13046]: I0308 04:04:53.672260 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-2z74v_af653e87-ce5f-4f1a-a20d-233c563694ba/cluster-version-operator/0.log" Mar 08 04:04:53.854820 master-0 kubenswrapper[13046]: I0308 04:04:53.854750 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/must-gather-5gpqv" event={"ID":"5ebb0e2d-c17b-434f-8dae-2303884e02f6","Type":"ContainerStarted","Data":"0e36da26e9e8193243bcbf825a6efcc999bf2ef087cd00368aede99a6189cffb"} Mar 08 04:04:53.890624 master-0 kubenswrapper[13046]: I0308 04:04:53.884152 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h2xfr/must-gather-5gpqv" podStartSLOduration=3.700228991 podStartE2EDuration="4.884116256s" podCreationTimestamp="2026-03-08 04:04:49 +0000 UTC" firstStartedPulling="2026-03-08 04:04:50.791696776 +0000 UTC m=+3092.870463993" lastFinishedPulling="2026-03-08 04:04:51.975584041 +0000 UTC m=+3094.054351258" observedRunningTime="2026-03-08 04:04:53.874060152 +0000 UTC m=+3095.952827369" watchObservedRunningTime="2026-03-08 04:04:53.884116256 +0000 UTC m=+3095.962883473" Mar 08 04:04:54.907017 master-0 kubenswrapper[13046]: I0308 04:04:54.906957 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-2z74v_af653e87-ce5f-4f1a-a20d-233c563694ba/cluster-version-operator/1.log" Mar 08 04:04:58.476691 master-0 kubenswrapper[13046]: I0308 04:04:58.476008 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-gkhw9_b3fc40be-0506-4106-86f1-4ea0b3a66734/nmstate-console-plugin/0.log" Mar 08 04:04:58.504566 master-0 kubenswrapper[13046]: I0308 04:04:58.502909 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rn6cd_5532eb26-c3d4-40a9-a0d8-3794569ef44b/nmstate-handler/0.log" Mar 08 04:04:58.541576 master-0 kubenswrapper[13046]: I0308 04:04:58.541208 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-7r4s5_1ca2dfb7-04ed-4252-a024-0287ba87ff9f/nmstate-metrics/0.log" Mar 08 04:04:58.563163 master-0 kubenswrapper[13046]: I0308 04:04:58.563116 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8p77_d8655289-e199-48db-be5c-78f68514a515/controller/0.log" Mar 08 04:04:58.568828 master-0 kubenswrapper[13046]: I0308 04:04:58.568799 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8p77_d8655289-e199-48db-be5c-78f68514a515/kube-rbac-proxy/0.log" Mar 08 04:04:58.597042 master-0 kubenswrapper[13046]: I0308 04:04:58.596846 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-7r4s5_1ca2dfb7-04ed-4252-a024-0287ba87ff9f/kube-rbac-proxy/0.log" Mar 08 04:04:58.618619 master-0 kubenswrapper[13046]: I0308 04:04:58.618584 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/controller/0.log" Mar 08 04:04:58.621192 master-0 kubenswrapper[13046]: I0308 04:04:58.621165 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-vhkqq_a123bab3-dc4d-42e5-a156-8dc6c3612334/nmstate-operator/0.log" Mar 08 04:04:58.664675 master-0 kubenswrapper[13046]: I0308 04:04:58.664632 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-gpxzx_e961965c-77d6-4dd8-b731-ecbdd4ef035d/nmstate-webhook/0.log" Mar 08 04:04:59.511345 master-0 kubenswrapper[13046]: I0308 04:04:59.511292 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/frr/0.log" Mar 08 04:04:59.528728 master-0 kubenswrapper[13046]: I0308 04:04:59.528680 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/reloader/0.log" Mar 08 04:04:59.536639 master-0 kubenswrapper[13046]: I0308 04:04:59.535886 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/frr-metrics/0.log" Mar 08 04:04:59.546694 master-0 kubenswrapper[13046]: I0308 04:04:59.546641 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/kube-rbac-proxy/0.log" Mar 08 04:04:59.569569 master-0 kubenswrapper[13046]: I0308 04:04:59.568518 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/kube-rbac-proxy-frr/0.log" Mar 08 04:04:59.583356 master-0 kubenswrapper[13046]: I0308 04:04:59.582662 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-frr-files/0.log" Mar 08 04:04:59.603284 master-0 kubenswrapper[13046]: I0308 04:04:59.600451 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-reloader/0.log" Mar 08 04:04:59.610430 master-0 kubenswrapper[13046]: I0308 04:04:59.610383 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-metrics/0.log" Mar 08 04:04:59.629834 master-0 kubenswrapper[13046]: I0308 
04:04:59.629386 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-546wf_2b5bd505-a6ef-490d-b7b4-83412df76a4f/frr-k8s-webhook-server/0.log" Mar 08 04:04:59.663904 master-0 kubenswrapper[13046]: I0308 04:04:59.663059 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68cfc6845d-mhm6t_e1d15c8d-0326-4e12-bdba-ed6df8b88ed0/manager/0.log" Mar 08 04:04:59.676272 master-0 kubenswrapper[13046]: I0308 04:04:59.676228 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7fb7496f9c-6jvkl_738a1cd1-8f37-4d94-abeb-36e19b8653b3/webhook-server/0.log" Mar 08 04:05:00.153345 master-0 kubenswrapper[13046]: I0308 04:05:00.152281 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8zzn_72cc246d-ba12-4435-90fb-e8a0c307bb48/speaker/0.log" Mar 08 04:05:00.167838 master-0 kubenswrapper[13046]: I0308 04:05:00.167782 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8zzn_72cc246d-ba12-4435-90fb-e8a0c307bb48/kube-rbac-proxy/0.log" Mar 08 04:05:00.809559 master-0 kubenswrapper[13046]: I0308 04:05:00.809509 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcdctl/0.log" Mar 08 04:05:01.014719 master-0 kubenswrapper[13046]: I0308 04:05:01.014360 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd/0.log" Mar 08 04:05:01.033022 master-0 kubenswrapper[13046]: I0308 04:05:01.032926 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-metrics/0.log" Mar 08 04:05:01.059502 master-0 kubenswrapper[13046]: I0308 04:05:01.058534 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-readyz/0.log" Mar 08 04:05:01.078622 master-0 kubenswrapper[13046]: I0308 04:05:01.078515 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-rev/0.log" Mar 08 04:05:01.105411 master-0 kubenswrapper[13046]: I0308 04:05:01.105360 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/setup/0.log" Mar 08 04:05:01.116127 master-0 kubenswrapper[13046]: I0308 04:05:01.115877 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-ensure-env-vars/0.log" Mar 08 04:05:01.142564 master-0 kubenswrapper[13046]: I0308 04:05:01.142265 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-resources-copy/0.log" Mar 08 04:05:01.164551 master-0 kubenswrapper[13046]: I0308 04:05:01.164504 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_2dc664e3-7f37-4fba-8104-544ffb18c1bd/installer/0.log" Mar 08 04:05:01.198609 master-0 kubenswrapper[13046]: I0308 04:05:01.198565 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_dbe1bc10-8da1-48fc-a9f0-089154ab30e3/installer/0.log" Mar 08 04:05:02.421966 master-0 kubenswrapper[13046]: I0308 04:05:02.421907 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-9g2h9_a5abe9d1-62c2-4d7e-9b77-403ea0cfbbf5/assisted-installer-controller/0.log" Mar 08 04:05:02.839339 master-0 kubenswrapper[13046]: I0308 04:05:02.839290 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-d69ccb978-jj8tj_8b8268e3-34e6-4672-b6ab-d9f93dd788d7/oauth-openshift/0.log" Mar 08 
04:05:03.033362 master-0 kubenswrapper[13046]: I0308 04:05:03.033007 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/must-gather-k6nrm" event={"ID":"ebb1cb39-3226-4730-9562-19f0d0018eca","Type":"ContainerStarted","Data":"934348b4aa4fe4eb4d140f7a008145b8410f5d3a30b5c2c4e206e42bdb98a4a6"} Mar 08 04:05:03.033362 master-0 kubenswrapper[13046]: I0308 04:05:03.033079 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/must-gather-k6nrm" event={"ID":"ebb1cb39-3226-4730-9562-19f0d0018eca","Type":"ContainerStarted","Data":"aa5f69e3f0ec229431872c5e8ba463426526b63a79443b297071f38d6a842622"} Mar 08 04:05:03.071511 master-0 kubenswrapper[13046]: I0308 04:05:03.069712 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h2xfr/must-gather-k6nrm" podStartSLOduration=2.910764829 podStartE2EDuration="14.069696036s" podCreationTimestamp="2026-03-08 04:04:49 +0000 UTC" firstStartedPulling="2026-03-08 04:04:50.665261301 +0000 UTC m=+3092.744028558" lastFinishedPulling="2026-03-08 04:05:01.824192538 +0000 UTC m=+3103.902959765" observedRunningTime="2026-03-08 04:05:03.061267168 +0000 UTC m=+3105.140034385" watchObservedRunningTime="2026-03-08 04:05:03.069696036 +0000 UTC m=+3105.148463253" Mar 08 04:05:04.092186 master-0 kubenswrapper[13046]: I0308 04:05:04.091785 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-zqlnx_f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/authentication-operator/1.log" Mar 08 04:05:04.151787 master-0 kubenswrapper[13046]: I0308 04:05:04.151735 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-zqlnx_f08a644f-3b61-46a7-a7b6-a9f7f2f7d266/authentication-operator/2.log" Mar 08 04:05:05.150441 master-0 kubenswrapper[13046]: I0308 04:05:05.150392 13046 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ingress_router-default-79f8cd6fdd-5s2m2_afd61ed2-3f0b-4f56-a99a-d93145461181/router/0.log" Mar 08 04:05:06.177347 master-0 kubenswrapper[13046]: I0308 04:05:06.177198 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc"] Mar 08 04:05:06.185954 master-0 kubenswrapper[13046]: I0308 04:05:06.180685 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.218567 master-0 kubenswrapper[13046]: I0308 04:05:06.208575 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc"] Mar 08 04:05:06.279297 master-0 kubenswrapper[13046]: I0308 04:05:06.278419 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-fb55b5d5d-pm69n_dac2b210-2fbb-4d25-a0ea-1825259cee3b/oauth-apiserver/0.log" Mar 08 04:05:06.293880 master-0 kubenswrapper[13046]: I0308 04:05:06.293842 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-fb55b5d5d-pm69n_dac2b210-2fbb-4d25-a0ea-1825259cee3b/fix-audit-permissions/0.log" Mar 08 04:05:06.294706 master-0 kubenswrapper[13046]: I0308 04:05:06.294671 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-proc\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.294912 master-0 kubenswrapper[13046]: I0308 04:05:06.294895 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-lib-modules\") pod 
\"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.295020 master-0 kubenswrapper[13046]: I0308 04:05:06.295007 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-sys\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.295104 master-0 kubenswrapper[13046]: I0308 04:05:06.295091 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmpm7\" (UniqueName: \"kubernetes.io/projected/165dfe64-d572-4cc6-ae49-89d976403a21-kube-api-access-mmpm7\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.295252 master-0 kubenswrapper[13046]: I0308 04:05:06.295238 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-podres\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.396931 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-proc\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397098 13046 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-lib-modules\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397134 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-sys\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397164 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmpm7\" (UniqueName: \"kubernetes.io/projected/165dfe64-d572-4cc6-ae49-89d976403a21-kube-api-access-mmpm7\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397213 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-podres\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397349 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-podres\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 
04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397396 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-proc\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397445 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-lib-modules\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.397510 master-0 kubenswrapper[13046]: I0308 04:05:06.397478 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/165dfe64-d572-4cc6-ae49-89d976403a21-sys\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.415140 master-0 kubenswrapper[13046]: I0308 04:05:06.415090 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmpm7\" (UniqueName: \"kubernetes.io/projected/165dfe64-d572-4cc6-ae49-89d976403a21-kube-api-access-mmpm7\") pod \"perf-node-gather-daemonset-p9nkc\" (UID: \"165dfe64-d572-4cc6-ae49-89d976403a21\") " pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.533544 master-0 kubenswrapper[13046]: I0308 04:05:06.533407 13046 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h2xfr/master-0-debug-df247"] Mar 08 04:05:06.534882 master-0 kubenswrapper[13046]: I0308 04:05:06.534853 13046 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:06.536340 master-0 kubenswrapper[13046]: I0308 04:05:06.536257 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:06.706509 master-0 kubenswrapper[13046]: I0308 04:05:06.705475 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/797d8166-b6bf-416b-97b5-59c98fdad8ba-host\") pod \"master-0-debug-df247\" (UID: \"797d8166-b6bf-416b-97b5-59c98fdad8ba\") " pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:06.706509 master-0 kubenswrapper[13046]: I0308 04:05:06.706270 13046 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh95f\" (UniqueName: \"kubernetes.io/projected/797d8166-b6bf-416b-97b5-59c98fdad8ba-kube-api-access-zh95f\") pod \"master-0-debug-df247\" (UID: \"797d8166-b6bf-416b-97b5-59c98fdad8ba\") " pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:06.808761 master-0 kubenswrapper[13046]: I0308 04:05:06.808597 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh95f\" (UniqueName: \"kubernetes.io/projected/797d8166-b6bf-416b-97b5-59c98fdad8ba-kube-api-access-zh95f\") pod \"master-0-debug-df247\" (UID: \"797d8166-b6bf-416b-97b5-59c98fdad8ba\") " pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:06.808988 master-0 kubenswrapper[13046]: I0308 04:05:06.808779 13046 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/797d8166-b6bf-416b-97b5-59c98fdad8ba-host\") pod \"master-0-debug-df247\" (UID: \"797d8166-b6bf-416b-97b5-59c98fdad8ba\") " pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:06.808988 master-0 kubenswrapper[13046]: 
I0308 04:05:06.808875 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/797d8166-b6bf-416b-97b5-59c98fdad8ba-host\") pod \"master-0-debug-df247\" (UID: \"797d8166-b6bf-416b-97b5-59c98fdad8ba\") " pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:06.824642 master-0 kubenswrapper[13046]: I0308 04:05:06.824209 13046 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh95f\" (UniqueName: \"kubernetes.io/projected/797d8166-b6bf-416b-97b5-59c98fdad8ba-kube-api-access-zh95f\") pod \"master-0-debug-df247\" (UID: \"797d8166-b6bf-416b-97b5-59c98fdad8ba\") " pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:06.867648 master-0 kubenswrapper[13046]: I0308 04:05:06.867473 13046 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h2xfr/master-0-debug-df247" Mar 08 04:05:07.122779 master-0 kubenswrapper[13046]: I0308 04:05:07.121432 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/master-0-debug-df247" event={"ID":"797d8166-b6bf-416b-97b5-59c98fdad8ba","Type":"ContainerStarted","Data":"112ec63af8ac68ac370262f79863a23c8fad618183db336ce494881011dbc7e6"} Mar 08 04:05:07.131614 master-0 kubenswrapper[13046]: I0308 04:05:07.129695 13046 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc"] Mar 08 04:05:07.412244 master-0 kubenswrapper[13046]: I0308 04:05:07.412138 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/kube-rbac-proxy/0.log" Mar 08 04:05:07.432165 master-0 kubenswrapper[13046]: I0308 04:05:07.432118 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/2.log" Mar 08 04:05:07.461198 master-0 kubenswrapper[13046]: I0308 04:05:07.461067 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-cpnw6_17eaab63-9ba9-4a4a-891d-a76aa3f03b46/cluster-autoscaler-operator/3.log" Mar 08 04:05:07.476864 master-0 kubenswrapper[13046]: I0308 04:05:07.476820 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/3.log" Mar 08 04:05:07.477937 master-0 kubenswrapper[13046]: I0308 04:05:07.477900 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/cluster-baremetal-operator/4.log" Mar 08 04:05:07.496205 master-0 kubenswrapper[13046]: I0308 04:05:07.495740 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-gwv4q_5f8cefbb-0ac8-4d0d-a923-7a863bd4d35b/baremetal-kube-rbac-proxy/0.log" Mar 08 04:05:07.513516 master-0 kubenswrapper[13046]: I0308 04:05:07.513448 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/1.log" Mar 08 04:05:07.520138 master-0 kubenswrapper[13046]: I0308 04:05:07.520100 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-gwnnd_5a2c9576-f7bd-4ac5-a7fe-530f26642f97/control-plane-machine-set-operator/2.log" Mar 08 04:05:07.549875 master-0 kubenswrapper[13046]: I0308 04:05:07.547403 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-qt654_c3729e29-4c57-4f9b-8202-a87fd3a9a722/kube-rbac-proxy/1.log" Mar 08 04:05:07.572879 master-0 kubenswrapper[13046]: I0308 04:05:07.572818 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-qt654_c3729e29-4c57-4f9b-8202-a87fd3a9a722/machine-api-operator/0.log" Mar 08 04:05:08.137882 master-0 kubenswrapper[13046]: I0308 04:05:08.137813 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" event={"ID":"165dfe64-d572-4cc6-ae49-89d976403a21","Type":"ContainerStarted","Data":"adb2a658ea784bd1661260eb52aeea27bfbc2880e1b829bc14b44b1ec4888884"} Mar 08 04:05:08.137882 master-0 kubenswrapper[13046]: I0308 04:05:08.137876 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" event={"ID":"165dfe64-d572-4cc6-ae49-89d976403a21","Type":"ContainerStarted","Data":"9a638d00c96df3535f876580df2e06b6aa7cfffee32fd7a52d1ac01ebd24c7ee"} Mar 08 04:05:08.139200 master-0 kubenswrapper[13046]: I0308 04:05:08.139157 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" Mar 08 04:05:08.164951 master-0 kubenswrapper[13046]: I0308 04:05:08.164884 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc" podStartSLOduration=2.164466735 podStartE2EDuration="2.164466735s" podCreationTimestamp="2026-03-08 04:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 04:05:08.159021211 +0000 UTC m=+3110.237788418" watchObservedRunningTime="2026-03-08 04:05:08.164466735 +0000 UTC m=+3110.243233962" Mar 08 04:05:08.612591 master-0 kubenswrapper[13046]: I0308 
04:05:08.612526 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-api-0_19902afd-81da-486c-96b3-27e0b46b2d38/cinder-e64dd-api-log/0.log" Mar 08 04:05:08.647317 master-0 kubenswrapper[13046]: I0308 04:05:08.646562 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-api-0_19902afd-81da-486c-96b3-27e0b46b2d38/cinder-api/0.log" Mar 08 04:05:08.734797 master-0 kubenswrapper[13046]: I0308 04:05:08.734738 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-backup-0_100689ad-dc43-494b-a7a2-f0351b969ab7/cinder-backup/0.log" Mar 08 04:05:08.756566 master-0 kubenswrapper[13046]: I0308 04:05:08.756517 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-backup-0_100689ad-dc43-494b-a7a2-f0351b969ab7/probe/0.log" Mar 08 04:05:08.831857 master-0 kubenswrapper[13046]: I0308 04:05:08.831813 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-scheduler-0_7e69d8ee-8a68-41e2-8af4-af4d00ec9af2/cinder-scheduler/0.log" Mar 08 04:05:08.859307 master-0 kubenswrapper[13046]: I0308 04:05:08.856594 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-scheduler-0_7e69d8ee-8a68-41e2-8af4-af4d00ec9af2/probe/0.log" Mar 08 04:05:08.939175 master-0 kubenswrapper[13046]: I0308 04:05:08.939064 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-volume-lvm-iscsi-0_bd0e3137-0784-440c-abba-948535a56e3b/cinder-volume/0.log" Mar 08 04:05:08.966102 master-0 kubenswrapper[13046]: I0308 04:05:08.966035 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-e64dd-volume-lvm-iscsi-0_bd0e3137-0784-440c-abba-948535a56e3b/probe/0.log" Mar 08 04:05:08.995810 master-0 kubenswrapper[13046]: I0308 04:05:08.995762 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-74d79d4489-nq9k9_2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a/dnsmasq-dns/0.log" Mar 08 04:05:09.001756 master-0 kubenswrapper[13046]: I0308 04:05:09.001709 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-74d79d4489-nq9k9_2038b1aa-7c2f-4d34-878e-1c7caf3f3c7a/init/0.log" Mar 08 04:05:09.066750 master-0 kubenswrapper[13046]: I0308 04:05:09.066697 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-bf784-default-external-api-0_98c075f3-f193-415e-a94d-d5ee77a6738b/glance-log/0.log" Mar 08 04:05:09.084225 master-0 kubenswrapper[13046]: I0308 04:05:09.083779 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-bf784-default-external-api-0_98c075f3-f193-415e-a94d-d5ee77a6738b/glance-httpd/0.log" Mar 08 04:05:09.194577 master-0 kubenswrapper[13046]: I0308 04:05:09.192474 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-bf784-default-internal-api-0_3dd0b908-1f05-4fca-89a8-b6eb1f41c33d/glance-log/0.log" Mar 08 04:05:09.227756 master-0 kubenswrapper[13046]: I0308 04:05:09.227718 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-bf784-default-internal-api-0_3dd0b908-1f05-4fca-89a8-b6eb1f41c33d/glance-httpd/0.log" Mar 08 04:05:09.256265 master-0 kubenswrapper[13046]: I0308 04:05:09.255929 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-7859765cb5-ddvmn_25cc77c3-42af-4d5e-bd93-a0d0c7e07092/ironic-api-log/0.log" Mar 08 04:05:09.313518 master-0 kubenswrapper[13046]: I0308 04:05:09.313267 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7_d0931dbb-67f7-46d4-bc29-4dacdd9d1108/cluster-cloud-controller-manager/0.log" Mar 08 04:05:09.336437 master-0 kubenswrapper[13046]: I0308 04:05:09.330908 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-7859765cb5-ddvmn_25cc77c3-42af-4d5e-bd93-a0d0c7e07092/ironic-api/0.log" Mar 08 04:05:09.336437 master-0 kubenswrapper[13046]: I0308 04:05:09.333405 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7_d0931dbb-67f7-46d4-bc29-4dacdd9d1108/config-sync-controllers/0.log" Mar 08 04:05:09.340737 master-0 kubenswrapper[13046]: I0308 04:05:09.339941 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-7859765cb5-ddvmn_25cc77c3-42af-4d5e-bd93-a0d0c7e07092/init/0.log" Mar 08 04:05:09.374884 master-0 kubenswrapper[13046]: I0308 04:05:09.374586 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_528b1064-a3b2-4ea4-8584-abeffdbedbbe/ironic-conductor/0.log" Mar 08 04:05:09.375702 master-0 kubenswrapper[13046]: I0308 04:05:09.375674 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-cnhw7_d0931dbb-67f7-46d4-bc29-4dacdd9d1108/kube-rbac-proxy/0.log" Mar 08 04:05:09.390157 master-0 kubenswrapper[13046]: I0308 04:05:09.389663 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_528b1064-a3b2-4ea4-8584-abeffdbedbbe/httpboot/0.log" Mar 08 04:05:09.400997 master-0 kubenswrapper[13046]: I0308 04:05:09.400944 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_528b1064-a3b2-4ea4-8584-abeffdbedbbe/dnsmasq/0.log" Mar 08 04:05:09.410335 master-0 kubenswrapper[13046]: I0308 04:05:09.410302 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_528b1064-a3b2-4ea4-8584-abeffdbedbbe/init/0.log" Mar 08 04:05:09.420112 master-0 kubenswrapper[13046]: I0308 04:05:09.420071 13046 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-conductor-0_528b1064-a3b2-4ea4-8584-abeffdbedbbe/ironic-python-agent-init/0.log" Mar 08 04:05:10.094619 master-0 kubenswrapper[13046]: I0308 04:05:10.094559 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_528b1064-a3b2-4ea4-8584-abeffdbedbbe/pxe-init/0.log" Mar 08 04:05:10.187503 master-0 kubenswrapper[13046]: I0308 04:05:10.187422 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_fa21680e-6931-4e05-afe2-bceebbb4389e/ironic-inspector-httpd/0.log" Mar 08 04:05:10.248412 master-0 kubenswrapper[13046]: I0308 04:05:10.248328 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_fa21680e-6931-4e05-afe2-bceebbb4389e/ironic-inspector/0.log" Mar 08 04:05:10.264929 master-0 kubenswrapper[13046]: I0308 04:05:10.263974 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_fa21680e-6931-4e05-afe2-bceebbb4389e/inspector-httpboot/0.log" Mar 08 04:05:10.289169 master-0 kubenswrapper[13046]: I0308 04:05:10.287195 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_fa21680e-6931-4e05-afe2-bceebbb4389e/ramdisk-logs/0.log" Mar 08 04:05:10.327623 master-0 kubenswrapper[13046]: I0308 04:05:10.327577 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_fa21680e-6931-4e05-afe2-bceebbb4389e/inspector-dnsmasq/0.log" Mar 08 04:05:10.335786 master-0 kubenswrapper[13046]: I0308 04:05:10.335319 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_fa21680e-6931-4e05-afe2-bceebbb4389e/ironic-python-agent-init/0.log" Mar 08 04:05:10.381304 master-0 kubenswrapper[13046]: I0308 04:05:10.380732 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_fa21680e-6931-4e05-afe2-bceebbb4389e/inspector-pxe-init/0.log" Mar 08 04:05:10.400102 
master-0 kubenswrapper[13046]: I0308 04:05:10.400014 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-9f967cb96-7vpvw_d093b57a-247f-4d76-8ad2-659f459f5f1a/ironic-neutron-agent/2.log"
Mar 08 04:05:10.404241 master-0 kubenswrapper[13046]: I0308 04:05:10.404144 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-9f967cb96-7vpvw_d093b57a-247f-4d76-8ad2-659f459f5f1a/ironic-neutron-agent/1.log"
Mar 08 04:05:10.416226 master-0 kubenswrapper[13046]: I0308 04:05:10.416173 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29549041-ckb6v_4fc77ddd-0e59-4f30-96d2-380b1543777a/keystone-cron/0.log"
Mar 08 04:05:10.528986 master-0 kubenswrapper[13046]: I0308 04:05:10.528104 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db8dcf7d7-ct9xk_98bf3b85-b910-4088-a5b4-8aa77f503535/keystone-api/0.log"
Mar 08 04:05:11.875909 master-0 kubenswrapper[13046]: I0308 04:05:11.875858 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-55d85b7b47-rggnq_caa3a50c-1291-4152-a48a-f7c7b49627db/kube-rbac-proxy/0.log"
Mar 08 04:05:11.926294 master-0 kubenswrapper[13046]: I0308 04:05:11.926239 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-55d85b7b47-rggnq_caa3a50c-1291-4152-a48a-f7c7b49627db/cloud-credential-operator/0.log"
Mar 08 04:05:14.197580 master-0 kubenswrapper[13046]: I0308 04:05:14.194503 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/2.log"
Mar 08 04:05:14.207211 master-0 kubenswrapper[13046]: I0308 04:05:14.207175 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-config-operator/3.log"
Mar 08 04:05:14.230512 master-0 kubenswrapper[13046]: I0308 04:05:14.230006 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-zg4zr_3bf93333-b537-4f23-9c77-6a245b290fe3/openshift-api/0.log"
Mar 08 04:05:15.499588 master-0 kubenswrapper[13046]: I0308 04:05:15.499469 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-6c7fb6b958-flsjg_ec752a2e-4b18-4f4d-af88-19594345ae1c/console-operator/0.log"
Mar 08 04:05:16.466133 master-0 kubenswrapper[13046]: I0308 04:05:16.466073 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f654c497d-dwcp2_42c1884e-96c1-46ee-a5dc-2267c7d84e2a/console/0.log"
Mar 08 04:05:16.514456 master-0 kubenswrapper[13046]: I0308 04:05:16.511052 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-84f57b9877-h4zwb_9eb92440-4e70-4fa6-9315-444d6f99e287/download-server/0.log"
Mar 08 04:05:16.572654 master-0 kubenswrapper[13046]: I0308 04:05:16.572612 13046 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h2xfr/perf-node-gather-daemonset-p9nkc"
Mar 08 04:05:17.739111 master-0 kubenswrapper[13046]: I0308 04:05:17.739052 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-4qmzb_0e569889-4759-4046-b0ed-e550078521c6/cluster-storage-operator/0.log"
Mar 08 04:05:17.750666 master-0 kubenswrapper[13046]: I0308 04:05:17.750471 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-4qmzb_0e569889-4759-4046-b0ed-e550078521c6/cluster-storage-operator/1.log"
Mar 08 04:05:17.772799 master-0 kubenswrapper[13046]: I0308 04:05:17.772718 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/2.log"
Mar 08 04:05:17.773021 master-0 kubenswrapper[13046]: I0308 04:05:17.772939 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-j6jpn_555ae3b4-71c6-4b62-9e09-66a58ae4c6ad/snapshot-controller/3.log"
Mar 08 04:05:17.796315 master-0 kubenswrapper[13046]: I0308 04:05:17.796260 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-8fxl8_ba9496ed-060e-4118-9da6-89b82bd49263/csi-snapshot-controller-operator/1.log"
Mar 08 04:05:17.805202 master-0 kubenswrapper[13046]: I0308 04:05:17.805152 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-8fxl8_ba9496ed-060e-4118-9da6-89b82bd49263/csi-snapshot-controller-operator/2.log"
Mar 08 04:05:18.815198 master-0 kubenswrapper[13046]: I0308 04:05:18.815142 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-589895fbb7-z45kw_8f99f81a-fd2d-432e-a3bc-e451342650b1/dns-operator/0.log"
Mar 08 04:05:18.834875 master-0 kubenswrapper[13046]: I0308 04:05:18.834830 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-589895fbb7-z45kw_8f99f81a-fd2d-432e-a3bc-e451342650b1/kube-rbac-proxy/0.log"
Mar 08 04:05:19.782547 master-0 kubenswrapper[13046]: E0308 04:05:19.782317 13046 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:39026->192.168.32.10:33351: write tcp 192.168.32.10:39026->192.168.32.10:33351: write: broken pipe
Mar 08 04:05:20.236872 master-0 kubenswrapper[13046]: I0308 04:05:20.236814 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-htnv4_bda3bd48-6de3-49b0-b2ce-96d97e97f178/dns/0.log"
Mar 08 04:05:20.550533 master-0 kubenswrapper[13046]: I0308 04:05:20.550397 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-htnv4_bda3bd48-6de3-49b0-b2ce-96d97e97f178/kube-rbac-proxy/0.log"
Mar 08 04:05:20.576637 master-0 kubenswrapper[13046]: I0308 04:05:20.575578 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lmqn7_dfe0357f-dab4-4424-869c-f6070b411a35/dns-node-resolver/0.log"
Mar 08 04:05:21.684613 master-0 kubenswrapper[13046]: I0308 04:05:21.684541 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-gfmq4_8c0192f3-2e60-42c6-9836-c70a9fa407d5/etcd-operator/1.log"
Mar 08 04:05:21.734315 master-0 kubenswrapper[13046]: I0308 04:05:21.734254 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-gfmq4_8c0192f3-2e60-42c6-9836-c70a9fa407d5/etcd-operator/2.log"
Mar 08 04:05:22.264630 master-0 kubenswrapper[13046]: I0308 04:05:22.264469 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_96bb7670-d973-44e2-b9f5-887303acf725/memcached/0.log"
Mar 08 04:05:22.423705 master-0 kubenswrapper[13046]: I0308 04:05:22.423644 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d8564b49-5mpt2_262f4148-a42c-44b0-b736-023696c55964/neutron-api/0.log"
Mar 08 04:05:22.440163 master-0 kubenswrapper[13046]: I0308 04:05:22.440117 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d8564b49-5mpt2_262f4148-a42c-44b0-b736-023696c55964/neutron-httpd/0.log"
Mar 08 04:05:22.537072 master-0 kubenswrapper[13046]: I0308 04:05:22.536965 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3d57373c-2f48-40da-baa7-611702e9ace5/nova-api-log/0.log"
Mar 08 04:05:22.579401 master-0 kubenswrapper[13046]: I0308 04:05:22.577282 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcdctl/0.log"
Mar 08 04:05:22.804885 master-0 kubenswrapper[13046]: I0308 04:05:22.804765 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3d57373c-2f48-40da-baa7-611702e9ace5/nova-api-api/0.log"
Mar 08 04:05:22.886124 master-0 kubenswrapper[13046]: I0308 04:05:22.886069 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd/0.log"
Mar 08 04:05:22.931941 master-0 kubenswrapper[13046]: I0308 04:05:22.931899 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c16893d4-11ce-49ea-b9a9-be326e7887de/nova-cell0-conductor-conductor/0.log"
Mar 08 04:05:22.934295 master-0 kubenswrapper[13046]: I0308 04:05:22.934278 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-metrics/0.log"
Mar 08 04:05:22.948312 master-0 kubenswrapper[13046]: I0308 04:05:22.948247 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-readyz/0.log"
Mar 08 04:05:22.967702 master-0 kubenswrapper[13046]: I0308 04:05:22.966640 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-rev/0.log"
Mar 08 04:05:22.979118 master-0 kubenswrapper[13046]: I0308 04:05:22.978890 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/setup/0.log"
Mar 08 04:05:23.003644 master-0 kubenswrapper[13046]: I0308 04:05:23.001764 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-ensure-env-vars/0.log"
Mar 08 04:05:23.016254 master-0 kubenswrapper[13046]: I0308 04:05:23.015314 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-resources-copy/0.log"
Mar 08 04:05:23.026870 master-0 kubenswrapper[13046]: I0308 04:05:23.026614 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-compute-ironic-compute-0_63cbc85a-14cf-42d6-90a3-a6f4199557f9/nova-cell1-compute-ironic-compute-compute/0.log"
Mar 08 04:05:23.033443 master-0 kubenswrapper[13046]: I0308 04:05:23.033399 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_2dc664e3-7f37-4fba-8104-544ffb18c1bd/installer/0.log"
Mar 08 04:05:23.113245 master-0 kubenswrapper[13046]: I0308 04:05:23.113199 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_dbe1bc10-8da1-48fc-a9f0-089154ab30e3/installer/0.log"
Mar 08 04:05:23.132769 master-0 kubenswrapper[13046]: I0308 04:05:23.132713 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_efc783b5-a9ba-494e-8e7e-3a1e26d4194c/nova-cell1-conductor-conductor/0.log"
Mar 08 04:05:23.203451 master-0 kubenswrapper[13046]: I0308 04:05:23.203398 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_00cccaa9-e6cb-443c-ba0b-5477870e47be/nova-cell1-novncproxy-novncproxy/0.log"
Mar 08 04:05:23.306512 master-0 kubenswrapper[13046]: I0308 04:05:23.305475 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_52318bdd-862f-46ac-af96-9672cf810025/nova-metadata-log/0.log"
Mar 08 04:05:23.918190 master-0 kubenswrapper[13046]: I0308 04:05:23.918128 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_52318bdd-862f-46ac-af96-9672cf810025/nova-metadata-metadata/0.log"
Mar 08 04:05:24.024524 master-0 kubenswrapper[13046]: I0308 04:05:24.021474 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c6334b14-f49e-4a43-867a-2456a72324ab/nova-scheduler-scheduler/0.log"
Mar 08 04:05:24.061834 master-0 kubenswrapper[13046]: I0308 04:05:24.061788 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1098c02e-9145-47f5-b794-cdc3f015a7b5/galera/0.log"
Mar 08 04:05:24.078659 master-0 kubenswrapper[13046]: I0308 04:05:24.078472 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1098c02e-9145-47f5-b794-cdc3f015a7b5/mysql-bootstrap/0.log"
Mar 08 04:05:24.110551 master-0 kubenswrapper[13046]: I0308 04:05:24.108991 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd/galera/0.log"
Mar 08 04:05:24.124810 master-0 kubenswrapper[13046]: I0308 04:05:24.124774 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aeffd4c4-56b4-41c6-b701-0b0ad0ff63bd/mysql-bootstrap/0.log"
Mar 08 04:05:24.135582 master-0 kubenswrapper[13046]: I0308 04:05:24.135545 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2736a83d-7185-4aad-af7a-9b36d243400d/openstackclient/0.log"
Mar 08 04:05:24.211262 master-0 kubenswrapper[13046]: I0308 04:05:24.211167 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4wzhj_fe1b89c1-2d73-4551-81ea-a2ef4dd88b5c/ovn-controller/0.log"
Mar 08 04:05:24.219142 master-0 kubenswrapper[13046]: I0308 04:05:24.219106 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-sxqz2_0188fe7d-a015-4d70-b6ab-a001523d4ebd/openstack-network-exporter/0.log"
Mar 08 04:05:24.229306 master-0 kubenswrapper[13046]: I0308 04:05:24.229279 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5d2gw_5b658945-9aef-47dc-8600-eb30f696cc3b/ovsdb-server/0.log"
Mar 08 04:05:24.237537 master-0 kubenswrapper[13046]: I0308 04:05:24.237496 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5d2gw_5b658945-9aef-47dc-8600-eb30f696cc3b/ovs-vswitchd/0.log"
Mar 08 04:05:24.245916 master-0 kubenswrapper[13046]: I0308 04:05:24.245832 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5d2gw_5b658945-9aef-47dc-8600-eb30f696cc3b/ovsdb-server-init/0.log"
Mar 08 04:05:24.267195 master-0 kubenswrapper[13046]: I0308 04:05:24.267162 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47992c73-c637-40e5-955f-9738ece43dc5/ovn-northd/0.log"
Mar 08 04:05:24.274526 master-0 kubenswrapper[13046]: I0308 04:05:24.274471 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47992c73-c637-40e5-955f-9738ece43dc5/openstack-network-exporter/0.log"
Mar 08 04:05:24.298323 master-0 kubenswrapper[13046]: I0308 04:05:24.298245 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3533a834-99ca-4bb9-bc59-2c8eeb11a85e/ovsdbserver-nb/0.log"
Mar 08 04:05:24.305182 master-0 kubenswrapper[13046]: I0308 04:05:24.305090 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3533a834-99ca-4bb9-bc59-2c8eeb11a85e/openstack-network-exporter/0.log"
Mar 08 04:05:24.419236 master-0 kubenswrapper[13046]: I0308 04:05:24.419191 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c186da86-9a0c-48e2-a06a-babcc5d9e02c/ovsdbserver-sb/0.log"
Mar 08 04:05:24.431764 master-0 kubenswrapper[13046]: I0308 04:05:24.431680 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c186da86-9a0c-48e2-a06a-babcc5d9e02c/openstack-network-exporter/0.log"
Mar 08 04:05:24.547816 master-0 kubenswrapper[13046]: I0308 04:05:24.547285 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-86d6d77c7c-4vqgc_e74c8bb2-e063-4b60-b3fe-651aa534d029/cluster-image-registry-operator/0.log"
Mar 08 04:05:24.572040 master-0 kubenswrapper[13046]: I0308 04:05:24.571949 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-86d6d77c7c-4vqgc_e74c8bb2-e063-4b60-b3fe-651aa534d029/cluster-image-registry-operator/1.log"
Mar 08 04:05:24.596362 master-0 kubenswrapper[13046]: I0308 04:05:24.593783 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5b6ffc768d-hm56p_26d64446-1c0e-488a-b489-a05dfe5ad9a6/placement-log/0.log"
Mar 08 04:05:24.596902 master-0 kubenswrapper[13046]: I0308 04:05:24.596880 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-85drn_b0119e83-0ee0-47e4-b591-6f2dc36073d2/node-ca/0.log"
Mar 08 04:05:24.628909 master-0 kubenswrapper[13046]: I0308 04:05:24.628771 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5b6ffc768d-hm56p_26d64446-1c0e-488a-b489-a05dfe5ad9a6/placement-api/0.log"
Mar 08 04:05:24.661891 master-0 kubenswrapper[13046]: I0308 04:05:24.661844 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5d102cd0-1a7f-4196-883c-bf2fd94fc7f2/rabbitmq/0.log"
Mar 08 04:05:24.668092 master-0 kubenswrapper[13046]: I0308 04:05:24.668023 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5d102cd0-1a7f-4196-883c-bf2fd94fc7f2/setup-container/0.log"
Mar 08 04:05:24.718531 master-0 kubenswrapper[13046]: I0308 04:05:24.718438 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_53e48517-115a-43d0-ad79-a342efe0cf49/rabbitmq/0.log"
Mar 08 04:05:24.732169 master-0 kubenswrapper[13046]: I0308 04:05:24.732136 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_53e48517-115a-43d0-ad79-a342efe0cf49/setup-container/0.log"
Mar 08 04:05:24.785269 master-0 kubenswrapper[13046]: I0308 04:05:24.785223 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-765dd9b47b-xzxwf_d6603220-12c5-4879-a72d-f12a27e7ed84/proxy-httpd/0.log"
Mar 08 04:05:24.796232 master-0 kubenswrapper[13046]: I0308 04:05:24.796186 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-765dd9b47b-xzxwf_d6603220-12c5-4879-a72d-f12a27e7ed84/proxy-server/0.log"
Mar 08 04:05:24.803760 master-0 kubenswrapper[13046]: I0308 04:05:24.803691 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tbd5b_f646560d-325d-41dc-ac99-a36f08ba0149/swift-ring-rebalance/0.log"
Mar 08 04:05:24.828746 master-0 kubenswrapper[13046]: I0308 04:05:24.828707 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/account-server/0.log"
Mar 08 04:05:24.861598 master-0 kubenswrapper[13046]: I0308 04:05:24.861559 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/account-replicator/0.log"
Mar 08 04:05:24.871721 master-0 kubenswrapper[13046]: I0308 04:05:24.870362 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/account-auditor/0.log"
Mar 08 04:05:24.882585 master-0 kubenswrapper[13046]: I0308 04:05:24.881428 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/account-reaper/0.log"
Mar 08 04:05:24.894865 master-0 kubenswrapper[13046]: I0308 04:05:24.894817 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/container-server/0.log"
Mar 08 04:05:24.915643 master-0 kubenswrapper[13046]: I0308 04:05:24.915605 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/container-replicator/0.log"
Mar 08 04:05:24.921879 master-0 kubenswrapper[13046]: I0308 04:05:24.921843 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/container-auditor/0.log"
Mar 08 04:05:24.935358 master-0 kubenswrapper[13046]: I0308 04:05:24.935293 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/container-updater/0.log"
Mar 08 04:05:24.950737 master-0 kubenswrapper[13046]: I0308 04:05:24.950438 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/object-server/0.log"
Mar 08 04:05:24.967622 master-0 kubenswrapper[13046]: I0308 04:05:24.967188 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/object-replicator/0.log"
Mar 08 04:05:24.989865 master-0 kubenswrapper[13046]: I0308 04:05:24.989822 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/object-auditor/0.log"
Mar 08 04:05:25.004474 master-0 kubenswrapper[13046]: I0308 04:05:25.004428 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/object-updater/0.log"
Mar 08 04:05:25.022507 master-0 kubenswrapper[13046]: I0308 04:05:25.020086 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/object-expirer/0.log"
Mar 08 04:05:25.030097 master-0 kubenswrapper[13046]: I0308 04:05:25.029699 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/rsync/0.log"
Mar 08 04:05:25.036266 master-0 kubenswrapper[13046]: I0308 04:05:25.036240 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bdcf318b-3d3e-42da-ae4b-39c6a17f8437/swift-recon-cron/0.log"
Mar 08 04:05:25.362604 master-0 kubenswrapper[13046]: I0308 04:05:25.362462 13046 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2xfr/master-0-debug-df247" event={"ID":"797d8166-b6bf-416b-97b5-59c98fdad8ba","Type":"ContainerStarted","Data":"de4c7104a8c7c49a625dd6d7d27a45fb387177b65b56c9bfeae0209b0db4ecb3"}
Mar 08 04:05:25.382291 master-0 kubenswrapper[13046]: I0308 04:05:25.382221 13046 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h2xfr/master-0-debug-df247" podStartSLOduration=1.455881478 podStartE2EDuration="19.38220628s" podCreationTimestamp="2026-03-08 04:05:06 +0000 UTC" firstStartedPulling="2026-03-08 04:05:07.029948305 +0000 UTC m=+3109.108715522" lastFinishedPulling="2026-03-08 04:05:24.956273107 +0000 UTC m=+3127.035040324" observedRunningTime="2026-03-08 04:05:25.376821978 +0000 UTC m=+3127.455589195" watchObservedRunningTime="2026-03-08 04:05:25.38220628 +0000 UTC m=+3127.460973497"
Mar 08 04:05:25.536537 master-0 kubenswrapper[13046]: I0308 04:05:25.536462 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/1.log"
Mar 08 04:05:25.548602 master-0 kubenswrapper[13046]: I0308 04:05:25.548553 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/ingress-operator/2.log"
Mar 08 04:05:25.572813 master-0 kubenswrapper[13046]: I0308 04:05:25.572742 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-r9m2k_fd6b827c-70b0-47ed-b07c-c696343248a8/kube-rbac-proxy/0.log"
Mar 08 04:05:26.545109 master-0 kubenswrapper[13046]: I0308 04:05:26.544996 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vfssr_7f273cc5-a290-421e-9ad9-b6f0db792fe2/serve-healthcheck-canary/0.log"
Mar 08 04:05:27.303318 master-0 kubenswrapper[13046]: I0308 04:05:27.303265 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-8f89dfddd-zd6kq_b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/insights-operator/3.log"
Mar 08 04:05:27.329148 master-0 kubenswrapper[13046]: I0308 04:05:27.329105 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-8f89dfddd-zd6kq_b33ed2de-435b-4ccc-8dfd-29d52bf95ea8/insights-operator/4.log"
Mar 08 04:05:29.574055 master-0 kubenswrapper[13046]: I0308 04:05:29.574005 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb45ce29-bddc-46ac-a337-0ca37b46714e/alertmanager/0.log"
Mar 08 04:05:29.589028 master-0 kubenswrapper[13046]: I0308 04:05:29.588992 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb45ce29-bddc-46ac-a337-0ca37b46714e/config-reloader/0.log"
Mar 08 04:05:29.608689 master-0 kubenswrapper[13046]: I0308 04:05:29.608635 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb45ce29-bddc-46ac-a337-0ca37b46714e/kube-rbac-proxy-web/0.log"
Mar 08 04:05:29.620654 master-0 kubenswrapper[13046]: I0308 04:05:29.620576 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb45ce29-bddc-46ac-a337-0ca37b46714e/kube-rbac-proxy/0.log"
Mar 08 04:05:29.638219 master-0 kubenswrapper[13046]: I0308 04:05:29.638184 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb45ce29-bddc-46ac-a337-0ca37b46714e/kube-rbac-proxy-metric/0.log"
Mar 08 04:05:29.651255 master-0 kubenswrapper[13046]: I0308 04:05:29.651197 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb45ce29-bddc-46ac-a337-0ca37b46714e/prom-label-proxy/0.log"
Mar 08 04:05:29.668194 master-0 kubenswrapper[13046]: I0308 04:05:29.668139 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eb45ce29-bddc-46ac-a337-0ca37b46714e/init-config-reloader/0.log"
Mar 08 04:05:29.745957 master-0 kubenswrapper[13046]: I0308 04:05:29.745900 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-674cbfbd9d-gj775_4108f513-acef-473a-ab03-f3761b2bd0d8/cluster-monitoring-operator/0.log"
Mar 08 04:05:29.761068 master-0 kubenswrapper[13046]: I0308 04:05:29.761009 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-t5lc5_2ea520cd-5fd4-4354-8cbb-38539cbef506/kube-state-metrics/0.log"
Mar 08 04:05:29.775860 master-0 kubenswrapper[13046]: I0308 04:05:29.775820 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-t5lc5_2ea520cd-5fd4-4354-8cbb-38539cbef506/kube-rbac-proxy-main/0.log"
Mar 08 04:05:29.787047 master-0 kubenswrapper[13046]: I0308 04:05:29.786995 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-t5lc5_2ea520cd-5fd4-4354-8cbb-38539cbef506/kube-rbac-proxy-self/0.log"
Mar 08 04:05:29.804538 master-0 kubenswrapper[13046]: I0308 04:05:29.804490 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-766946c477-ghtbv_1893fc9c-7c29-4674-8011-f046dd63a08b/metrics-server/0.log"
Mar 08 04:05:29.820063 master-0 kubenswrapper[13046]: I0308 04:05:29.819973 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-d9c677fbc-vwt7f_00a324ea-209d-4b0c-86af-3058436a291a/monitoring-plugin/0.log"
Mar 08 04:05:29.848174 master-0 kubenswrapper[13046]: I0308 04:05:29.848127 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rtdkr_23651d40-89da-46c4-a6cb-b4c031e826cb/node-exporter/0.log"
Mar 08 04:05:29.858699 master-0 kubenswrapper[13046]: I0308 04:05:29.858613 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rtdkr_23651d40-89da-46c4-a6cb-b4c031e826cb/kube-rbac-proxy/0.log"
Mar 08 04:05:29.869456 master-0 kubenswrapper[13046]: I0308 04:05:29.869414 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rtdkr_23651d40-89da-46c4-a6cb-b4c031e826cb/init-textfile/0.log"
Mar 08 04:05:29.886568 master-0 kubenswrapper[13046]: I0308 04:05:29.886521 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-mcvbt_268919d0-afa6-48ed-a6cb-3f558fc78b5d/kube-rbac-proxy-main/0.log"
Mar 08 04:05:29.901265 master-0 kubenswrapper[13046]: I0308 04:05:29.901223 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-mcvbt_268919d0-afa6-48ed-a6cb-3f558fc78b5d/kube-rbac-proxy-self/0.log"
Mar 08 04:05:29.913203 master-0 kubenswrapper[13046]: I0308 04:05:29.913152 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-mcvbt_268919d0-afa6-48ed-a6cb-3f558fc78b5d/openshift-state-metrics/0.log"
Mar 08 04:05:29.949894 master-0 kubenswrapper[13046]: I0308 04:05:29.949861 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77985828-fe36-4c71-afe2-3b0a69f6220b/prometheus/0.log"
Mar 08 04:05:29.960582 master-0 kubenswrapper[13046]: I0308 04:05:29.960560 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77985828-fe36-4c71-afe2-3b0a69f6220b/config-reloader/0.log"
Mar 08 04:05:29.986506 master-0 kubenswrapper[13046]: I0308 04:05:29.982951 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77985828-fe36-4c71-afe2-3b0a69f6220b/thanos-sidecar/0.log"
Mar 08 04:05:30.009984 master-0 kubenswrapper[13046]: I0308 04:05:30.009952 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77985828-fe36-4c71-afe2-3b0a69f6220b/kube-rbac-proxy-web/0.log"
Mar 08 04:05:30.025317 master-0 kubenswrapper[13046]: I0308 04:05:30.025272 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77985828-fe36-4c71-afe2-3b0a69f6220b/kube-rbac-proxy/0.log"
Mar 08 04:05:30.045544 master-0 kubenswrapper[13046]: I0308 04:05:30.045466 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77985828-fe36-4c71-afe2-3b0a69f6220b/kube-rbac-proxy-thanos/0.log"
Mar 08 04:05:30.059756 master-0 kubenswrapper[13046]: I0308 04:05:30.059711 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77985828-fe36-4c71-afe2-3b0a69f6220b/init-config-reloader/0.log"
Mar 08 04:05:30.080640 master-0 kubenswrapper[13046]: I0308 04:05:30.080326 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5ff8674d55-ck44k_08259d7a-3093-4f7d-b1ef-04f0f954e986/prometheus-operator/0.log"
Mar 08 04:05:30.091831 master-0 kubenswrapper[13046]: I0308 04:05:30.091786 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5ff8674d55-ck44k_08259d7a-3093-4f7d-b1ef-04f0f954e986/kube-rbac-proxy/0.log"
Mar 08 04:05:30.108004 master-0 kubenswrapper[13046]: I0308 04:05:30.107901 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8464df8497-2zr24_e96e2bdd-2b4f-45c9-8db0-4b910d86d62d/prometheus-operator-admission-webhook/0.log"
Mar 08 04:05:30.146900 master-0 kubenswrapper[13046]: I0308 04:05:30.146791 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-77897b758b-4ff46_2f3aa698-2f96-4668-94ff-f287305790c7/telemeter-client/0.log"
Mar 08 04:05:30.157990 master-0 kubenswrapper[13046]: I0308 04:05:30.157933 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-77897b758b-4ff46_2f3aa698-2f96-4668-94ff-f287305790c7/reload/0.log"
Mar 08 04:05:30.171805 master-0 kubenswrapper[13046]: I0308 04:05:30.171751 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-77897b758b-4ff46_2f3aa698-2f96-4668-94ff-f287305790c7/kube-rbac-proxy/0.log"
Mar 08 04:05:30.192275 master-0 kubenswrapper[13046]: I0308 04:05:30.192231 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-688d9d49f9-nnqwz_b3311dbb-cd30-4cd9-9f18-d360521bec39/thanos-query/0.log"
Mar 08 04:05:30.206018 master-0 kubenswrapper[13046]: I0308 04:05:30.205971 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-688d9d49f9-nnqwz_b3311dbb-cd30-4cd9-9f18-d360521bec39/kube-rbac-proxy-web/0.log"
Mar 08 04:05:30.220743 master-0 kubenswrapper[13046]: I0308 04:05:30.220600 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-688d9d49f9-nnqwz_b3311dbb-cd30-4cd9-9f18-d360521bec39/kube-rbac-proxy/0.log"
Mar 08 04:05:30.239368 master-0 kubenswrapper[13046]: I0308 04:05:30.239317 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-688d9d49f9-nnqwz_b3311dbb-cd30-4cd9-9f18-d360521bec39/prom-label-proxy/0.log"
Mar 08 04:05:30.254740 master-0 kubenswrapper[13046]: I0308 04:05:30.254663 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-688d9d49f9-nnqwz_b3311dbb-cd30-4cd9-9f18-d360521bec39/kube-rbac-proxy-rules/0.log"
Mar 08 04:05:30.265667 master-0 kubenswrapper[13046]: I0308 04:05:30.265615 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-688d9d49f9-nnqwz_b3311dbb-cd30-4cd9-9f18-d360521bec39/kube-rbac-proxy-metrics/0.log"
Mar 08 04:05:32.393893 master-0 kubenswrapper[13046]: I0308 04:05:32.393825 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8p77_d8655289-e199-48db-be5c-78f68514a515/controller/0.log"
Mar 08 04:05:32.404358 master-0 kubenswrapper[13046]: I0308 04:05:32.404081 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8p77_d8655289-e199-48db-be5c-78f68514a515/kube-rbac-proxy/0.log"
Mar 08 04:05:32.424863 master-0 kubenswrapper[13046]: I0308 04:05:32.424808 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/controller/0.log"
Mar 08 04:05:33.477212 master-0 kubenswrapper[13046]: I0308 04:05:33.477159 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8p77_d8655289-e199-48db-be5c-78f68514a515/controller/0.log"
Mar 08 04:05:33.487369 master-0 kubenswrapper[13046]: I0308 04:05:33.487306 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-n8p77_d8655289-e199-48db-be5c-78f68514a515/kube-rbac-proxy/0.log"
Mar 08 04:05:33.503712 master-0 kubenswrapper[13046]: I0308 04:05:33.503671 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/frr/0.log"
Mar 08 04:05:33.507305 master-0 kubenswrapper[13046]: I0308 04:05:33.507256 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/controller/0.log"
Mar 08 04:05:33.522198 master-0 kubenswrapper[13046]: I0308 04:05:33.522165 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/reloader/0.log"
Mar 08 04:05:33.534051 master-0 kubenswrapper[13046]: I0308 04:05:33.534030 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/frr-metrics/0.log"
Mar 08 04:05:33.549751 master-0 kubenswrapper[13046]: I0308 04:05:33.549709 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/kube-rbac-proxy/0.log"
Mar 08 04:05:33.561019 master-0 kubenswrapper[13046]: I0308 04:05:33.560980 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/kube-rbac-proxy-frr/0.log"
Mar 08 04:05:33.575237 master-0 kubenswrapper[13046]: I0308 04:05:33.575194 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-frr-files/0.log"
Mar 08 04:05:33.588795 master-0 kubenswrapper[13046]: I0308 04:05:33.588747 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-reloader/0.log"
Mar 08 04:05:33.603836 master-0 kubenswrapper[13046]: I0308 04:05:33.603795 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-metrics/0.log"
Mar 08 04:05:33.623175 master-0 kubenswrapper[13046]: I0308 04:05:33.623128 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-546wf_2b5bd505-a6ef-490d-b7b4-83412df76a4f/frr-k8s-webhook-server/0.log"
Mar 08 04:05:33.662801 master-0 kubenswrapper[13046]: I0308 04:05:33.662762 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68cfc6845d-mhm6t_e1d15c8d-0326-4e12-bdba-ed6df8b88ed0/manager/0.log"
Mar 08 04:05:33.681234 master-0 kubenswrapper[13046]: I0308 04:05:33.680894 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7fb7496f9c-6jvkl_738a1cd1-8f37-4d94-abeb-36e19b8653b3/webhook-server/0.log"
Mar 08 04:05:34.197552 master-0 kubenswrapper[13046]: I0308 04:05:34.197503 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8zzn_72cc246d-ba12-4435-90fb-e8a0c307bb48/speaker/0.log"
Mar 08 04:05:34.215737 master-0 kubenswrapper[13046]: I0308 04:05:34.215687 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8zzn_72cc246d-ba12-4435-90fb-e8a0c307bb48/kube-rbac-proxy/0.log"
Mar 08 04:05:34.492788 master-0 kubenswrapper[13046]: I0308 04:05:34.492734 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7_6b60fab9-7af7-4945-be33-d495f643467c/extract/0.log"
Mar 08 04:05:34.509118 master-0 kubenswrapper[13046]: I0308 04:05:34.509058 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7_6b60fab9-7af7-4945-be33-d495f643467c/util/0.log"
Mar 08 04:05:34.542518 master-0 kubenswrapper[13046]: I0308 04:05:34.542442 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f997p8s7_6b60fab9-7af7-4945-be33-d495f643467c/pull/0.log"
Mar 08 04:05:34.703452 master-0 kubenswrapper[13046]: I0308 04:05:34.703404 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/frr/0.log"
Mar 08 04:05:34.720814 master-0 kubenswrapper[13046]: I0308 04:05:34.720766 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/reloader/0.log"
Mar 08 04:05:34.726670 master-0 kubenswrapper[13046]: I0308 04:05:34.726641 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/frr-metrics/0.log"
Mar 08 04:05:34.738904 master-0 kubenswrapper[13046]: I0308 04:05:34.738840 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/kube-rbac-proxy/0.log"
Mar 08 04:05:34.762199 master-0 kubenswrapper[13046]: I0308 04:05:34.762073 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/kube-rbac-proxy-frr/0.log"
Mar 08 04:05:34.773985 master-0 kubenswrapper[13046]: I0308 04:05:34.773940 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-frr-files/0.log"
Mar 08 04:05:34.786449 master-0 kubenswrapper[13046]: I0308 04:05:34.786344 13046 log.go:25] "Finished parsing log file"
path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-reloader/0.log" Mar 08 04:05:34.802129 master-0 kubenswrapper[13046]: I0308 04:05:34.802089 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7jbsp_202b3558-b98e-401f-9c22-529f5a27dd5b/cp-metrics/0.log" Mar 08 04:05:34.820740 master-0 kubenswrapper[13046]: I0308 04:05:34.820704 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-546wf_2b5bd505-a6ef-490d-b7b4-83412df76a4f/frr-k8s-webhook-server/0.log" Mar 08 04:05:34.861636 master-0 kubenswrapper[13046]: I0308 04:05:34.861580 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68cfc6845d-mhm6t_e1d15c8d-0326-4e12-bdba-ed6df8b88ed0/manager/0.log" Mar 08 04:05:34.877435 master-0 kubenswrapper[13046]: I0308 04:05:34.875405 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7fb7496f9c-6jvkl_738a1cd1-8f37-4d94-abeb-36e19b8653b3/webhook-server/0.log" Mar 08 04:05:35.392946 master-0 kubenswrapper[13046]: I0308 04:05:35.391698 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8zzn_72cc246d-ba12-4435-90fb-e8a0c307bb48/speaker/0.log" Mar 08 04:05:35.409682 master-0 kubenswrapper[13046]: I0308 04:05:35.409641 13046 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8zzn_72cc246d-ba12-4435-90fb-e8a0c307bb48/kube-rbac-proxy/0.log"